WorldWideScience

Sample records for statistical quality control

  1. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology; On-line Control, including Sampling Inspection and Statistical Process Control; Off-line Control, with Data Analysis and Experimental Design; and fields related to Reliability. Experts with international reputations present their newest contributions.

  2. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2001-01-01

    The book is a collection of papers presented at the 5th International Workshop on Intelligent Statistical Quality Control in Würzburg, Germany. Contributions deal with methodology and successful industrial applications. They can be grouped in four categories: Sampling Inspection, Statistical Process Control, Data Analysis and Process Capability Studies, and Experimental Design.

  3. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.

    1991-01-01

    In scientific research laboratories it is rarely possible to use quality assurance schemes developed for large-scale analysis. Instead, methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control: Analysis of precision serves to detect analytical errors by comparing the a priori precision of the analytical results with the actual variability observed among replicates or duplicates. The method relies on the chi-square distribution to detect excess variability and is quite sensitive even for 5-10 results. Interference control serves to detect analytical bias by comparing results obtained by two different analytical methods, each relying on a different detection principle and therefore exhibiting different influence from matrix elements; only 5-10 sets of results are required to establish whether a regression line passes...
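    The analysis-of-precision idea described above can be sketched as a chi-square check on duplicate results. The function below is an illustrative reconstruction, not the author's exact formulation: it assumes a common a priori analytical standard deviation for all pairs, so each duplicate pair contributes (x1 - x2)^2 / (2*sigma_a^2) and the sum is chi-square distributed under statistical control.

```python
def analysis_of_precision(duplicates, sigma_a):
    """Chi-square check of analytical precision from duplicate results.

    duplicates: list of (x1, x2) pairs measured on the same sample.
    sigma_a:    a priori analytical standard deviation (assumed equal
                for all pairs in this simplified sketch).

    Under statistical control, t follows a chi-square distribution with
    dof = len(duplicates) degrees of freedom; excess variability
    inflates t beyond the chi-square critical value for dof.
    """
    t = sum((x1 - x2) ** 2 / (2.0 * sigma_a ** 2) for x1, x2 in duplicates)
    dof = len(duplicates)
    return t, dof
```

    With 5-10 pairs, t would be compared against, say, the 95th percentile of the chi-square distribution with dof degrees of freedom, matching the sensitivity claim in the abstract.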

  4. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    1997-01-01

    Like the preceding volumes, which met with a lively response, the present volume collects contributions stressing methodology or successful industrial applications. The papers are classified under four main headings: sampling inspection, process quality control, data analysis and process capability studies, and finally experimental design.

  5. Frontiers in statistical quality control 11

    CERN Document Server

    Schmid, Wolfgang

    2015-01-01

    The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientifi...

  6. Net analyte signal based statistical quality control

    NARCIS (Netherlands)

    Skibsted, E.T.S.; Boelens, H.F.M.; Westerhuis, J.A.; Smilde, A.K.; Broad, N.W.; Rees, D.R.; Witte, D.T.

    2005-01-01

    Net analyte signal statistical quality control (NAS-SQC) is a new methodology to perform multivariate product quality monitoring based on the net analyte signal approach. The main advantage of NAS-SQC is that the systematic variation in the product due to the analyte (or property) of interest is

  7. Statistical quality control a loss minimization approach

    CERN Document Server

    Trietsch, Dan

    1999-01-01

    While many books on quality espouse the Taguchi loss function, they do not examine its impact on statistical quality control (SQC). But using the Taguchi loss function sheds new light on questions relating to SQC and calls for some changes. This book covers SQC in a way that conforms with the need to minimize loss. Subjects often not covered elsewhere include: (i) measurements, (ii) determining how many points to sample to obtain reliable control charts (for which purpose a new graphic tool, diffidence charts, is introduced), (iii) the connection between process capability and tolerances, (iv)

  8. Statistical process control for radiotherapy quality assurance

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.

    2005-01-01

    Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a very few large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques are applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process, whereas with a standard deviation calculation no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process
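    A process behavior chart of the kind described above is typically an individuals/moving-range (XmR) chart. The sketch below is a generic textbook illustration, not the authors' implementation; the constant 1.128 is the standard d2 bias-correction factor for moving ranges of two consecutive points.

```python
def xmr_limits(values):
    """Individuals (X) process-behavior limits from a series of daily
    QA measurements, using the average moving range of size two.

    Returns (lower limit, center line, upper limit). A point outside
    the limits signals a systematic (assignable-cause) change rather
    than routine random variation.
    """
    n = len(values)
    mean = sum(values) / n
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = mr_bar / 1.128  # d2 constant for subgroups of size 2
    return mean - 3 * sigma_hat, mean, mean + 3 * sigma_hat
```

    Because sigma is estimated from short-term (point-to-point) variation rather than the overall standard deviation, the limits stay tight when the process drifts, which is why such charts can flag systematic changes that a plain standard-deviation threshold misses.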

  9. Control cards as a statistical quality control resource

    Directory of Open Access Journals (Sweden)

    Aleksandar Živan Drenovac

    2013-02-01

    Full Text Available This paper shows that the application of statistical methods can significantly contribute to increasing the quality of products and services, as well as to raising an institution's rating. The determination of optimal, i.e. anticipatory and limiting, values is based on statistical analysis of samples. Control charts are a very reliable instrument, simple to use and efficient for process control, by which a process is kept within set limits. Thus, control charts can be applied in quality control of processes for the production of weapons and military equipment, in the maintenance of technical systems, as well as for setting standards and raising the quality level of many other activities.

  10. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
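    The charting step of such an SPC program is commonly an X-bar/R chart over rational subgroups. A minimal sketch, assuming subgroups of size five (the constants A2, D3, D4 come from standard control-chart tables for n = 5; the function is illustrative, not the article's procedure):

```python
# Standard control-chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """X-bar and R chart limits from a list of equal-size subgroups.

    Returns a dict with (lower, center, upper) limits for the subgroup
    means ("xbar") and the subgroup ranges ("r").
    """
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = sum(xbars) / len(xbars)   # grand mean (center line)
    rbar = sum(ranges) / len(ranges)    # average range
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
        "r": (D3 * rbar, rbar, D4 * rbar),
    }
```

    Reducing process variability shows up directly on such charts as a shrinking average range and tighter X-bar limits.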

  11. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  12. The application of statistical process control in linac quality assurance

    International Nuclear Information System (INIS)

    Li Dingyu; Dai Jianrong

    2009-01-01

    Objective: To improve the linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC is applied to set the control limits of QA data, draw charts and differentiate between random and systematic errors. An SPC quality assurance software package named QA MANAGER has been developed in VB for clinical use. Two clinical cases are analyzed with SPC to study daily output QA of a 6 MV photon beam. Results: In the clinical cases, SPC is able to identify the systematic errors. Conclusion: The SPC application may assist in detecting systematic errors in linac quality assurance; it flags abnormal trends so that systematic errors can be eliminated, thus improving quality control. (authors)

  13. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    Science.gov (United States)

    Gourrion, J.; Szekely, T.

    2016-02-01

    Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit the next version of the Copernicus CMEMS dataset.
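    The min/max validity-interval idea can be illustrated in a few lines. This is a sketch of the general principle only; the margin parameter is a hypothetical allowance, not part of the CORA specification:

```python
def validity_interval_minmax(history, margin=0.0):
    """Validity interval from historical extremes: a new observation is
    flagged only if it falls outside [min - margin, max + margin] of the
    reference data. Unlike mean +/- k*sd screening, this makes no
    assumption of unimodality, symmetry or homogeneous kurtosis.
    """
    lo, hi = min(history), max(history)
    return lo - margin, hi + margin

def is_suspect(value, interval):
    """True if the observation lies outside the validity interval."""
    lo, hi = interval
    return not (lo <= value <= hi)
```

    In practice the reference extremes would be computed locally (per region and depth level) from the quality-controlled historical dataset described in the abstract.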

  14. Improved statistical method for temperature and salinity quality control

    Science.gov (United States)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.

  15. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…

  16. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of automatic processors and promote QC by analyzing the QC management chart for an automatic processor with statistical methods, and by evaluating and interpreting the data and trends of the chart. Method: Speed, contrast and minimum density of the step-wedge film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated. The data and the working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, the authors can judge whether it is a symmetric bell-shaped curve. If not, it indicates that a few extremes overstepping the control limits may be pulling the curve to the left or right. If it is a normal distribution, the standard deviation (s) is observed. When x-bar +- 2s lies within the upper and lower control limits of the relevant performance indexes, the processor works in a stable state during that period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. The authors can deepen understanding and application of the trend chart, and raise quality management to a new level

  17. Optimage central organised image quality control including statistics and reporting

    International Nuclear Information System (INIS)

    Jahnen, A.; Schilz, C.; Shannoun, F.; Schreiner, A.; Hermen, J.; Moll, C.

    2008-01-01

    Quality control of medical imaging systems is performed using dedicated phantoms. As imaging systems become increasingly digital, adequate image processing methods can help to save evaluation time and to obtain objective results. The software package OPTIMAGE was developed with this focus and takes a central approach: on the one hand, OPTIMAGE provides a framework which includes functions such as database integration, DICOM data sources, a multilingual user interface and image processing functionality. On the other hand, the test methods are implemented as modules which are able to process the images automatically for the common imaging systems. The integration of statistics and reporting into this environment is paramount: this is the only way to provide these functions in an interactive, user-friendly way. These features enable users to discover degradation in performance quickly and to document performed measurements easily. (authors)

  18. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    Science.gov (United States)

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January, 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. A computerized diagnostic system for nuclear plant control rooms based on statistical quality control

    International Nuclear Information System (INIS)

    Heising, C.D.; Grenzebach, W.S.

    1990-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant located in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, which were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators

  20. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    Science.gov (United States)

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
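    The sigma-metric mentioned above has a simple closed form, sigma = (TEa - |bias|) / CV, with all quantities in percent at the medical decision concentration. A minimal sketch (function name is illustrative):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric of an analytical examination process.

    tea_pct:  allowable total error (%)
    bias_pct: observed bias (%)
    cv_pct:   observed coefficient of variation (%)

    Higher sigma means lower risk; sigma >= 6 indicates world-class
    performance that tolerates very simple SQC rules, while low sigma
    calls for more control rules and more control measurements.
    """
    return (tea_pct - abs(bias_pct)) / cv_pct
```

    For example, an assay with TEa = 10%, bias = 1% and CV = 1.5% runs at six sigma, so a single-rule SQC design with few control measurements would suffice under the Westgard Sigma Rules approach.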

  1. Statistical Data Mining for Efficient Quality Control in Manufacturing

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben Steen

    2015-01-01

    of the process, e.g. sensor measurements, machine readings etc., and the major contributors to these big data sets are different quality control processes. In this article we present a methodology to extract valuable insight from manufacturing data. The proposed methodology is based on comparison of probabilities...

  2. Statistical method for quality control in presence of measurement errors

    International Nuclear Information System (INIS)

    Lauer-Peccoud, M.R.

    1998-01-01

    In a quality inspection of a set of items where the measurements of a quality characteristic of the items are contaminated by random errors, one can take wrong decisions which are damaging to quality. So it is important to control the risks in such a way that a final quality level is ensured. We consider that an item is defective or not according to whether the value G of its quality characteristic is larger or smaller than a given level g0. We assume that, due to the lack of precision of the measurement instrument, the measurement M of this characteristic is expressed by f(G) + ξ, where f is an increasing function such that the value f(g0) is known and ξ is a random error with mean zero and given variance. First we study the problem of the determination of a critical measure m such that a specified quality target is reached after the classification of a lot of items where each item is accepted or rejected depending on whether its measurement is smaller or greater than m. Then we analyse the problem of testing the global quality of a lot from the measurements for a sample of items taken from the lot. For these two kinds of problems and for different quality targets, we propose solutions with emphasis on the case where the function f is linear and the error ξ and the variable G are Gaussian. Simulation results allow one to appreciate the efficiency of the different control procedures considered and their robustness with respect to deviations from the assumptions used in the theoretical derivations. (author)
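    For the linear-Gaussian case emphasized in the abstract, the probability that an item is accepted has a closed form via the standard normal CDF. The sketch below assumes f(G) = a*G + b, error ξ ~ N(0, sigma_err^2), and acceptance when the measurement falls below the critical measure m; names and parameters are illustrative, not the author's notation:

```python
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def accept_probability(true_value, m, sigma_err, a=1.0, b=0.0):
    """P(M <= m | G = true_value) with M = a*G + b + xi,
    xi ~ N(0, sigma_err^2).

    Evaluating this at true values just above and below the defect
    level g0 shows how the choice of the critical measure m trades off
    false acceptance against false rejection.
    """
    z = (m - (a * true_value + b)) / sigma_err
    return phi(z)
```

    Setting m equal to the measured response at g0 gives a 50% acceptance probability for borderline items; shifting m guards one risk at the expense of the other.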

  3. Real-time statistical quality control and ARM

    International Nuclear Information System (INIS)

    Blough, D.K.

    1992-05-01

    An important component of the Atmospheric Radiation Measurement (ARM) Program is real-time quality control of data obtained from meteorological instruments. It is the goal of the ARM program to enhance the predictive capabilities of global circulation models by incorporating in them more detailed information on the radiative characteristics of the earth's atmosphere. To this end, a number of Cloud and Radiation Testbeds (CARTs) will be built at various locations worldwide. Each CART will consist of an array of instruments designed to collect radiative data. The large amount of data obtained from these instruments necessitates real-time processing in order to flag outliers and possible instrument malfunction. The Bayesian dynamic linear model (DLM) proves to be an effective way of monitoring the time series data which each instrument generates. It provides a flexible yet powerful approach to detecting in real time sudden shifts in a non-stationary multivariate time series. An application of these techniques to data arising from a remote sensing instrument to be used in the CART is provided. Using real data from a wind profiler, the ability of the DLM to detect outliers is studied. 5 refs

  4. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product

  5. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  6. Application of Statistical Process Control (SPC) in its quality control

    Directory of Open Access Journals (Sweden)

    Carlos Hernández-Pedrera

    2015-12-01

    Full Text Available The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. The specific objectives were to identify the variables to be analyzed for statistical process control (SPC), to analyze possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with SPC before and after direct monitoring on the production line. Sampling and laboratory methods were used to determine the quality of the finished product; statistical methods were then applied, seeking to emphasize the importance and contribution of their application in monitoring corrective actions and supporting production processes. It was shown that the process is under control, because the results fell within the established control limits. There is, however, a tendency to drift toward one end of the limits, and the distribution occasionally exceeds them, creating the possibility that under certain conditions the process goes out of control; the results also showed that the process, while within the quality control limits, is operating far from optimal conditions. In none of the study situations were products obtained outside the weight and discoloration limits, although defective products were obtained.

  7. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  8. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Science.gov (United States)

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three degrees-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI
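    The tracking metric used above, normalized (zero-mean) cross-correlation, can be computed as follows for two equal-length intensity samples. This is a generic textbook formulation, not the study's exact implementation over RTV voxels:

```python
import math

def ncc(a, b):
    """Zero-normalized cross-correlation of two equal-length intensity
    sequences. Returns 1.0 for perfectly positively correlated inputs,
    -1.0 for perfectly anti-correlated inputs; lower values flag a
    poorer match between the reference and the registered image.
    """
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    den_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return num / (den_a * den_b)
```

    In an SPC setting, the per-fraction NCC values would feed a control chart whose limits are set from the first few fractions, as the abstract describes.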

  9. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  10. Use of statistical process control as part of a quality assurance plan

    International Nuclear Information System (INIS)

    Acosta, S.; Lewis, C.

    2013-01-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of results through the control and monitoring of the factors influencing their reliability. The degree to which the factors contribute to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory of environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work concerns the control of blanks, and so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to consider significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of statistical process control. The data obtained yielded the lower and upper limits for the preparation of the blank control chart. In this way the process of characterization of the blank was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan

  11. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    An, Sang Ha

    2006-02-01

    Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of the heat exchanger. This research uses Shewhart X-bar and R charts, Cumulative Sum charts (CUSUM), and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control. A Control Chart Analyzer (CCA) was also developed to support these analyses and to decide whether the process is in error. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
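Of the charts named above, CUSUM is the one best suited to catching the small sustained shifts that precede equipment degradation. A sketch of the standard two-sided tabular CUSUM follows; the vibration readings and the parameter values (reference value k, decision interval h) are hypothetical, not taken from the thesis:

```python
def tabular_cusum(xs, mu0, k, h):
    """Two-sided tabular CUSUM. Returns the index of the first sample at
    which either cumulative sum exceeds the decision interval h, or None.
    k is the reference (allowance) value, typically half the shift of
    interest, in the same units as the data."""
    c_plus = c_minus = 0.0
    for i, x in enumerate(xs):
        c_plus = max(0.0, x - (mu0 + k) + c_plus)
        c_minus = max(0.0, (mu0 - k) - x + c_minus)
        if c_plus > h or c_minus > h:
            return i
    return None

# Hypothetical RCP vibration readings: in control around 5.0, then a small
# sustained upward shift that a Shewhart chart might miss.
readings = [5.0, 4.9, 5.1, 5.0, 5.2, 5.6, 5.7, 5.6, 5.8, 5.7]
print(tabular_cusum(readings, mu0=5.0, k=0.25, h=1.0))  # signals at index 7
```

Because the cumulative sums accumulate evidence across samples, the alarm fires a few observations after the shift begins, well before any single reading would cross a 3-sigma Shewhart limit.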

  12. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Science.gov (United States)

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  13. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    International Nuclear Information System (INIS)

    Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.

    2014-01-01

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves
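The per-leaf individuals charts with measurement-based control limits plus fixed specification limits described above can be sketched as follows. This is a simplified illustration, not the authors' monitoring system: the position errors are invented, and sigma is estimated from the average moving range (d2 = 1.128 for subgroups of size 2), a common convention for individuals charts:

```python
def individuals_chart_limits(xs):
    """Control limits for an individuals (I) chart, estimating sigma from
    the average moving range (d2 = 1.128 for subgroups of size 2)."""
    xbar = sum(xs) / len(xs)
    mrs = [abs(b - a) for a, b in zip(xs, xs[1:])]
    mr_bar = sum(mrs) / len(mrs)
    sigma_hat = mr_bar / 1.128
    return xbar - 3 * sigma_hat, xbar, xbar + 3 * sigma_hat

def flag_leaf(positions, spec=0.5):
    """Flag measurements that breach either the computed control limits
    (out of control) or a fixed +/- spec in mm (out of specification)."""
    lcl, _, ucl = individuals_chart_limits(positions)
    return [i for i, x in enumerate(positions)
            if x < lcl or x > ucl or abs(x) > spec]

# Hypothetical leaf position errors (mm) relative to the radiation isocenter;
# the seventh test breaches the 0.5 mm specification.
errors = [0.05, -0.02, 0.08, 0.01, -0.04, 0.03, 0.62, 0.02]
print(flag_leaf(errors))
```

Keeping the control-limit check separate from the specification check mirrors the distinction in the abstract between out-of-control leaves (unusual relative to their own history) and out-of-specification leaves (beyond the clinical tolerance).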

  14. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    Science.gov (United States)

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the

  15. Methods and applications of statistics in engineering, quality control, and the physical sciences

    CERN Document Server

    Balakrishnan, N

    2011-01-01

    Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously-published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...

  16. Statistical methods for quality improvement

    National Research Council Canada - National Science Library

    Ryan, Thomas P

    2011-01-01

    ...." -Technometrics. This new edition continues to provide the most current, proven statistical methods for quality control and quality improvement. The use of quantitative methods offers numerous benefits...

  17. Assessment of the GPC Control Quality Using Non–Gaussian Statistical Measures

    Directory of Open Access Journals (Sweden)

    Domański Paweł D.

    2017-06-01

    Full Text Available This paper presents an alternative approach to the task of control performance assessment. Various statistical measures based on Gaussian and non-Gaussian distribution functions are evaluated. The analysis starts with the review of control error histograms followed by their statistical analysis using probability distribution functions. Simulation results obtained for a control system with the generalized predictive controller algorithm are considered. The proposed approach using Cauchy and Lévy α-stable distributions shows robustness against disturbances and enables effective control loop quality evaluation. Tests of the predictive algorithm prove its ability to detect the impact of the main controller parameters, such as the model gain, the dynamics or the prediction horizon.

  18. Quality Control of the Print with the Application of Statistical Methods

    Science.gov (United States)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

    The basis for standardizing the process of offset printing is the control of print quality indicators. The solution of this problem has various approaches, among which the most important are statistical methods. Practical implementation of them for managing the quality of the printing process is very relevant and is reflected in this paper. The possibility of using the method of constructing a Control Card to identify the reasons for the deviation of the optical density for a triad of inks in offset printing is shown.

  19. Assessing thermal comfort and energy efficiency in buildings by statistical quality control for autocorrelated data

    International Nuclear Information System (INIS)

    Barbeito, Inés; Zaragoza, Sonia; Tarrío-Saavedra, Javier; Naya, Salvador

    2017-01-01

    Highlights: • Intelligent web platform development for energy efficiency management in buildings. • Controlling and supervising thermal comfort and energy consumption in buildings. • Statistical quality control procedure to deal with autocorrelated data. • Open source alternative using R software. - Abstract: In this paper, a case study of performing a reliable statistical procedure to evaluate the quality of HVAC systems in buildings, using data retrieved from an ad hoc big data web energy platform, is presented. The proposed methodology based on statistical quality control (SQC) is used to analyze the real state of thermal comfort and energy efficiency of the offices of the company FRIDAMA (Spain) in a reliable way. Non-conformities or alarms, and the actual assignable causes of these out-of-control states, are detected. The capability to meet specification requirements is also analyzed. Tools and packages implemented in the open-source R software are employed to apply the different procedures. First, this study proposes to fit ARIMA time series models to CTQ variables. Then, the application of Shewhart and EWMA control charts to the time series residuals is proposed to control and monitor thermal comfort and energy consumption in buildings. Once thermal comfort and consumption variability are estimated, the implementation of capability indexes for autocorrelated variables is proposed to calculate the degree to which standards specifications are met. According to the case study results, the proposed methodology detected real anomalies in the HVAC installation, helping to find assignable causes and to make appropriate decisions. One of the goals is to perform and describe this statistical procedure step by step so that practitioners can readily replicate it.
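The second step above, an EWMA chart applied to model residuals, can be sketched in a few lines. The residual values, sigma, and smoothing constant below are hypothetical; the chart uses the standard time-varying EWMA limits that widen toward their asymptotic value:

```python
import math

def ewma_chart(residuals, mu0=0.0, sigma=1.0, lam=0.2, L=3.0):
    """EWMA chart applied to (approximately uncorrelated) model residuals.
    Returns the indices of out-of-control points, using time-varying
    control limits that widen toward their asymptotic value."""
    z = mu0
    flagged = []
    for i, x in enumerate(residuals, start=1):
        z = lam * x + (1 - lam) * z
        width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mu0) > width:
            flagged.append(i - 1)
    return flagged

# Hypothetical ARIMA residuals of an indoor temperature series: a small
# sustained drift appears after sample 5 and the EWMA statistic catches it.
res = [0.1, -0.2, 0.0, 0.2, -0.1, 0.9, 1.1, 1.0, 1.2, 0.9]
print(ewma_chart(res, sigma=0.3))
```

Charting the residuals rather than the raw series is the key point: once the ARIMA model has absorbed the autocorrelation, the independence assumption behind the control limits is approximately restored.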

  20. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available The study deals with an analysis of data aimed at improving the quality of statistical tools in the assembly processes of automobile seats. Normal distribution of variables is one of the unavoidable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are increasingly many approaches to handling non-normal data. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution to the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study closely examines the measured data from an airbag assembly and aims to obtain normally distributed data and apply statistical process control to it. The results of the contribution conclude in a rejection of the null hypothesis (the measured variables do not follow the normal distribution), so it is necessary to begin work on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, so a procedure should be proposed that leads to a quality output of the whole statistical control of the manufacturing processes.
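A common transformation of the kind attempted above is the Box-Cox family, whose parameter can be chosen by maximizing the profile log-likelihood. A minimal grid-search sketch follows; the skewed sample data are invented, and this is a generic illustration of the technique, not the Minitab procedure used in the paper:

```python
import math
import statistics

def boxcox(xs, lam):
    """Box-Cox transform for positive data (lam = 0 gives the log)."""
    if abs(lam) < 1e-12:
        return [math.log(x) for x in xs]
    return [(x ** lam - 1) / lam for x in xs]

def boxcox_loglik(xs, lam):
    """Profile log-likelihood of the Box-Cox parameter lam."""
    n = len(xs)
    var = statistics.pvariance(boxcox(xs, lam))
    return -n / 2 * math.log(var) + (lam - 1) * sum(math.log(x) for x in xs)

def best_lambda(xs, grid=None):
    """Grid search for the lam that maximizes the profile log-likelihood."""
    grid = grid or [i / 10 for i in range(-20, 21)]
    return max(grid, key=lambda lam: boxcox_loglik(xs, lam))

# Hypothetical right-skewed gap measurements from an airbag assembly line
# (lognormal by construction, so the best lam should be near 0):
data = [math.exp(v) for v in [0.1, 0.5, 0.2, 1.4, 0.3, 0.9, 2.1, 0.4, 0.7, 1.1]]
lam = best_lambda(data)
print(lam)
```

As the abstract notes, no transformation is guaranteed to succeed: if the chosen lam still fails a normality test, the sensible fallback is a control method that does not assume normality.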

  1. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    Science.gov (United States)

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12), and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the center line showed inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  2. Development of statistical and analytical techniques for use in national quality control schemes for steroid hormones

    International Nuclear Information System (INIS)

    Wilson, D.W.; Gaskell, S.J.; Fahmy, D.R.; Joyce, B.G.; Groom, G.V.; Griffiths, K.; Kemp, K.W.; Nix, A.B.J.; Rowlands, R.J.

    1979-01-01

    Adopting the rationale that the improvement of intra-laboratory performance of immunometric assays will enable the assessment of national QC schemes to become more meaningful, the group of participating laboratories has developed statistical and analytical techniques for the improvement of accuracy, precision and monitoring of error for the determination of steroid hormones. These developments are now described and their relevance to NQC schemes discussed. Attention has been focussed on some of the factors necessary for improving standards of quality in immunometric assays and their relevance to laboratories participating in NQC schemes as described. These have included the 'accuracy', precision and robustness of assay procedures as well as improved methods for internal quality control. (Auth.)

  3. Using a statistical process control chart during the quality assessment of cancer registry data.

    Science.gov (United States)

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.

  4. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    Science.gov (United States)

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features are facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Automation in Siemens fuel manufacturing - the basis for quality improvement by statistical process control (SPC)

    International Nuclear Information System (INIS)

    Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.

    1999-01-01

    Statistical Process Control (SPC) is one of the systematic tools that makes a valuable contribution to the control and planning activities for manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a program to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on a realization of SPC in 3 pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC, and modifications were subsequently made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser-welding of spacers' are located at the different sites of ANF. Due to the completion of the first steps and the experience obtained in the pilot projects, the introduction program for SPC has already been extended to other manufacturing processes. (authors)

  6. QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Yonatan Mengesha Awaj

    2013-03-01

    Full Text Available In order to survive in a competitive market, improving the quality and productivity of a product or process is a must for any company. This study applies statistical process control (SPC) tools in the production processing line and on the final product in order to reduce defects, by identifying where the highest waste occurs and by giving suggestions for improvement. The approach used in this study comprises direct observation, thorough examination of the production process lines, brainstorming sessions, fishbone diagrams, and information collected from potential customers and the company's workers through interviews and questionnaires; a Pareto chart/analysis and a control chart (p-chart) were constructed. It was found that the company has many problems; specifically, there is high rejection or waste in the production processing line. The highest waste occurs in the melting process line, which causes loss due to trickle, and in the forming process line, which causes loss due to defective product rejection. The vital few problems were identified: blisters, double seam, stone, pressure failure, and overweight. The principal aim of the study is to create awareness in the quality team of how to use SPC tools in problem analysis, especially to train the quality team on how to hold an effective brainstorming session and how to exploit these data in cause-and-effect diagram construction, Pareto analysis, and control chart construction. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company has many constraints to implementing all suggestions for improvement within a short period of time, it recognized that the suggestions will provide significant productivity improvement in the long run.
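The p-chart mentioned above monitors the fraction of defective items per inspection lot. A minimal sketch follows; the subgroup size and daily defective counts are invented for illustration:

```python
import math

def p_chart_limits(p_bar, n):
    """Three-sigma limits for a p-chart with subgroup size n; the lower
    limit is truncated at zero."""
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

# Hypothetical daily bottle inspections: defectives out of n bottles checked;
# the last day's count is abnormally high.
n = 500
defectives = [21, 18, 25, 19, 22, 20, 24, 17, 23, 48]
fractions = [d / n for d in defectives]
p_bar = sum(defectives) / (n * len(defectives))
lcl, ucl = p_chart_limits(p_bar, n)
out = [i for i, p in enumerate(fractions) if p < lcl or p > ucl]
print(out)  # only the last day is out of control
```

A flagged subgroup like the last one would then feed the brainstorming and cause-and-effect analysis described in the abstract, to identify which of the vital few defect types drove the excursion.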

  7. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. The phenomena include pollution transport prediction from continuous area sources or one-time releases of toxic materials, and wind energy prospecting in areas of topographic enhancement of the wind. Quality control techniques that can be applied to these data to determine whether the instruments are operating within their prescribed tolerances were investigated. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability, and power spectra. The comparative techniques include covariance, cross-spectral analysis, and two-dimensional probability density. At present the calculating and plotting routines for these statistical techniques do not reside in a single code, so it is difficult to ascribe independent memory size and computation time accurately. However, given the flexibility of a data system which includes simple and fast-running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF), a proper balance will be attained. These techniques are described in detail and preliminary results are presented.
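The simple "instrument-end" statistics named above (mean, standard deviation, skewness, kurtosis) can be sketched in a few lines. The wind-speed record is invented, and the use of population (biased) moments is an assumption for simplicity:

```python
import math

def summary_moments(xs):
    """Mean, standard deviation, skewness, and excess kurtosis: the
    independent statistics used to screen a single instrument's record."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * sd ** 4) - 3.0
    return mean, sd, skew, kurt

# Hypothetical anemometer wind-speed record (m/s); a stuck or drifting
# instrument would show a collapsed standard deviation or distorted
# higher moments relative to its neighbours.
record = [3.2, 4.1, 2.8, 3.9, 5.0, 3.5, 4.4, 2.9, 3.7, 4.3]
mean, sd, skew, kurt = summary_moments(record)
print(round(mean, 2), round(sd, 2))
```

Because these statistics are cheap to compute, they suit the instrument end of the network, while the comparative techniques (covariance, cross-spectra) remain at the computational end.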

  8. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations Part 11 (21 CFR 211.180), all finished products should be reviewed annually against quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart. It neglects to take into account the interaction between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, where 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch has been checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA), for the data being either autoscaled or robust scaled, showed four and seven batches, respectively, out of the Hotelling T² 95% ellipse. Also, an improvement of the capability of the process is observed without the most extreme batches. The MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
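The idea behind the Hotelling T² statistic used above is that a batch can be unusual jointly even when each variable alone looks normal. A deliberately simplified two-ingredient sketch follows (the full study uses six ingredients and PCA); the assay values are invented, and the 2x2 covariance matrix is inverted by hand:

```python
def hotelling_t2_2d(samples):
    """Hotelling T^2 statistic for each 2-dimensional observation against
    the sample mean and covariance (manual 2x2 inversion)."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in samples) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in samples) / (n - 1)
    det = sxx * syy - sxy ** 2
    t2 = []
    for x, y in samples:
        dx, dy = x - mx, y - my
        # (dx, dy) S^-1 (dx, dy)^T with S^-1 = [[syy, -sxy], [-sxy, sxx]] / det
        t2.append((dx * dx * syy - 2 * dx * dy * sxy + dy * dy * sxx) / det)
    return t2

# Hypothetical assay results (% label claim) for two active ingredients:
# in the last batch each value alone looks acceptable, but the pair breaks
# the positive correlation seen in the other batches.
batches = [(99.8, 100.1), (100.2, 100.3), (99.9, 100.0), (100.1, 100.2),
           (100.0, 100.1), (99.7, 99.9), (100.3, 100.4), (100.2, 99.6)]
t2 = hotelling_t2_2d(batches)
print(max(range(len(t2)), key=t2.__getitem__))  # the last batch stands out
```

Univariate Shewhart charts on each ingredient would pass every one of these batches; the correlation-aware T² is what exposes the last one, which is the abstract's central point.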

  9. Statistical process control analysis for patient quality assurance of intensity modulated radiation therapy

    Science.gov (United States)

    Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun

    2017-11-01

    This study applied statistical process control to set and verify quality assurance (QA) tolerance standards suited to our hospital's characteristics, with the criteria applied to all treatment sites in this analysis. The gamma test criterion for delivery quality assurance (DQA) was based on 3%/3 mm. Head and neck, breast, and prostate cases of intensity modulated radiation therapy (IMRT) or volumetric arc radiation therapy (VMAT) were selected for the analysis of the QA treatment sites. The numbers of data sets used in the analysis were 73 and 68 for head and neck patients; prostate and breast were 49 and 152, measured by MapCHECK and ArcCHECK, respectively. The Cp values of the head and neck and prostate QA were above 1.0, and Cpml was 1.53 and 1.71, respectively, which is close to the target value of 100%. The Cpml value of breast (IMRT) was 1.67, and the data values are close to the target value of 95%; but the Cp value was 0.90, which means that the data values are widely distributed. Cp and Cpml of the breast VMAT QA were 1.07 and 2.10, respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.

  10. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    Science.gov (United States)

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
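The process capability indices underlying this kind of tolerance customization can be sketched as the classic Cp and Cpk. The D/MU deviation readings below are invented, and only the ±2% tolerance from the abstract is reused; this is a generic illustration, not the authors' computation:

```python
import statistics

def capability_indices(xs, lsl, usl):
    """Cp and Cpk for a (near-normal, in-control) process; Cpk additionally
    penalizes a process that is off-center within the limits."""
    mu = statistics.mean(xs)
    sigma = statistics.stdev(xs)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical daily D/MU consistency readings (% deviation), judged
# against the customized +/-2% daily-QA tolerance:
readings = [0.2, -0.1, 0.4, 0.0, -0.3, 0.1, 0.3, -0.2, 0.2, 0.0]
cp, cpk = capability_indices(readings, lsl=-2.0, usl=2.0)
print(cp > 1.33 and cpk > 1.33)  # True: process comfortably capable
```

A capability index well above the conventional 1.33 threshold indicates the tolerance could even be tightened, which is the logic by which SPC lets a site customize QA limits rather than adopt generic ones.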

  11. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    International Nuclear Information System (INIS)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-01-01

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  12. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries; the quarterly prevalence never exceeded the control limits, that is, was never out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Statistical quality control charts for liver transplant process indicators: evaluation of a single-center experience.

    Science.gov (United States)

    Varona, M A; Soriano, A; Aguirre-Jaime, A; Barrera, M A; Medina, M L; Bañon, N; Mendez, S; Lopez, E; Portero, J; Dominguez, D; Gonzalez, A

    2012-01-01

    Liver transplantation, the best option for many end-stage liver diseases, is indicated for more candidates than donor availability can meet. In this situation, this demanding treatment must achieve excellence, accessibility, and patient satisfaction to be ethical, scientific, and efficient. The current consensus on quality measurements promoted by the Sociedad Española de Trasplante Hepático (SETH) seeks to define criteria, indicators, and standards for liver transplantation in Spain. Following this recommendation, the Canary Islands liver program has reviewed its experience. We separated the 411 cadaveric transplants performed in the last 15 years into 2 groups: the first 100 and the subsequent 311. The 8 criteria of SETH 2010 were correctly fulfilled. For most indicators the outcomes were favorable, with actuarial survival at 1, 3, 5, and 10 years of 84%, 79%, 76%, and 65%, respectively, and excellent results in retransplant rates (early 0.56% and long-term 5.9%), primary nonfunction rate (0.43%), waiting list mortality (13.34%), and patient satisfaction (91.5%). On the other hand, some mortality indicators were worse, such as perioperative, postoperative, and early mortality with normal graft function, as well as the reoperation rate. After analysis of the series with statistical quality control charts, we observed an improvement in all indicators, even in the apparently worst one, early mortality with normal graft function, in a stable program. These results helped us identify specific areas in which to improve the program. Our study shows that applying quality measurement as the SETH consensus recommends, although a time-consuming process, is a useful tool. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    Science.gov (United States)

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve the glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. The project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities was implemented at the home, clinic, and institutional levels. Control charts of mean hemoglobin A1C (HbA1C) and of the proportion of patients meeting the target HbA1C showed improvement, with special cause variation identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  15. Experience in statistical quality control for road construction in South Africa

    CSIR Research Space (South Africa)

    Mitchell, MF

    1977-06-01

    Full Text Available The application of statistically oriented acceptance control procedures to a major road construction project is examined, and it is concluded that such procedures promise to be of benefit to both the client and the contractor....

  16. Quality control statistic for laboratory analysis and assays in Departamento de Tecnologia de Combustiveis - IPEN-BR

    International Nuclear Information System (INIS)

    Lima, Waldir C. de; Lainetti, Paulo E.O.; Lima, Roberto M. de; Peres, Henrique G.

    1996-01-01

    The purpose of this work is to study the introduction of statistical control for the tests and analyses performed in the Departamento de Tecnologia de Combustiveis. The following are briefly introduced: the theory of statistical process control, the construction of control charts, the definition of standard tests (or analyses), and how the standards are employed to determine the control limits on the charts. The most significant result is the form applied in practical quality control; the use of a verification standard in the control laboratory is also exemplified. (author)
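    The procedure outlined above, using replicate analyses of a standard to fix control-chart limits, can be illustrated as follows. This is a generic Shewhart individuals chart, not the laboratory's actual method, and the replicate values are invented.

```python
# Control limits for an individuals chart derived from replicate analyses
# of a laboratory standard (hypothetical concentration results).
import statistics

def individuals_limits(standard_results):
    """Centre line and 3-sigma limits estimated from replicates."""
    mean = statistics.fmean(standard_results)
    s = statistics.stdev(standard_results)
    return mean - 3.0 * s, mean, mean + 3.0 * s

replicates = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
lcl, centre, ucl = individuals_limits(replicates)
```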

  17. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    Full Text Available For any quality evaluation in sports, science, education, and so on, it is useful to collect data in order to construct a strategy to improve the quality of services offered to the user. For this purpose, we use statistical software packages to process the collected data with the goal of increasing customer satisfaction. The principle is demonstrated using satisfaction ratings given by students of Belgrade Polytechnic, as users, of the quality of the institution. Here the emphasis is on statistical analysis as a tool for quality control aimed at improvement, not on the interpretation of results. The same approach can therefore be used as a model in sport to improve overall results.

  18. Initiating statistical process control to improve quality outcomes in colorectal surgery.

    Science.gov (United States)

    Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P

    2015-12-01

    Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility for improving outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed--83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in the elective open (57.1 vs. 20.0%) and elective laparoscopic groups (77.6 vs. 26.1%). Outliers also had higher readmission rates in the emergent open (11.4 vs. 5.4%), emergent laparoscopic (14.3 vs. 9.2%), and elective laparoscopic (32.8 vs. 6.9%) groups. Elective open outliers did not follow the trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification during quality improvement efforts and reevaluation of outcomes after introducing process change. SPC has clinical implications for improving patient outcomes and resource utilization.
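    One common way to flag LOS outliers of the kind described above is an individuals/moving-range (XmR) chart. The sketch below is a generic version with invented lengths of stay, not the study's data, protocol, or grouping.

```python
# XmR-chart outlier detection: limits are mean +/- 2.66 * average moving range.
def xmr_outliers(values):
    """Return indices of points beyond the individuals-chart limits."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    mean = sum(values) / len(values)
    ucl, lcl = mean + 2.66 * mr_bar, mean - 2.66 * mr_bar
    return [i for i, v in enumerate(values) if v > ucl or v < lcl]

los_days = [3, 4, 3, 5, 4, 3, 4, 14, 3, 4]  # one prolonged stay
outliers = xmr_outliers(los_days)           # flags the 14-day stay
```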

  19. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. Design: process control and quality improvement. Routine prospective data collection started in 2006. Patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeon, clinical area, or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that

  20. Application of Statistical Increase in Industrial Quality

    International Nuclear Information System (INIS)

    Akhmad-Fauzy

    2000-01-01

    The application of statistical methods in the industrial field is relatively new compared with agriculture and biology. Statistical methods applied in industry focus on industrial system control and are useful for maintaining economical control of the quality of products produced on a large scale. The application of statistical methods in the industrial field has increased rapidly. This is supported by the release of the ISO 9000 quality system in 1987 as an international quality standard, which has been adopted by more than 100 countries. (author)

  1. Statistical methods in quality assurance

    International Nuclear Information System (INIS)

    Eckhard, W.

    1980-01-01

    During the different phases of a production process - planning, development and design, manufacturing, assembling, etc. - most decisions rest on a base of statistics: the collection, analysis, and interpretation of data. Statistical methods can be thought of as a kit of tools that help solve problems in the quality functions of the quality loop, with the aim of producing quality products and reducing quality costs. Various statistical methods are presented, and typical examples of their practical application are demonstrated. (RW)

  2. Population-based cancer survival in the United States: Data, quality control, and statistical methods.

    Science.gov (United States)

    Allemani, Claudia; Harewood, Rhea; Johnson, Christopher J; Carreira, Helena; Spika, Devon; Bonaventure, Audrey; Ward, Kevin; Weir, Hannah K; Coleman, Michel P

    2017-12-15

    Robust comparisons of population-based cancer survival estimates require tight adherence to the study protocol, standardized quality control, appropriate life tables of background mortality, and centralized analysis. The CONCORD program established worldwide surveillance of population-based cancer survival in 2015, analyzing individual data on 26 million patients (including 10 million US patients) diagnosed between 1995 and 2009 with 1 of 10 common malignancies. In this Cancer supplement, we analyzed data from 37 state cancer registries that participated in the second cycle of the CONCORD program (CONCORD-2), covering approximately 80% of the US population. Data quality checks were performed in 3 consecutive phases: protocol adherence, exclusions, and editorial checks. One-, 3-, and 5-year age-standardized net survival was estimated using the Pohar Perme estimator and state- and race-specific life tables of all-cause mortality for each year. The cohort approach was adopted for patients diagnosed between 2001 and 2003, and the complete approach for patients diagnosed between 2004 and 2009. Articles in this supplement report population coverage, data quality indicators, and age-standardized 5-year net survival by state, race, and stage at diagnosis. Examples of tables, bar charts, and funnel plots are provided in this article. Population-based cancer survival is a key measure of the overall effectiveness of services in providing equitable health care. The high quality of US cancer registry data, 80% population coverage, and use of an unbiased net survival estimator ensure that the survival trends reported in this supplement are robustly comparable by race and state. The results can be used by policymakers to identify and address inequities in cancer survival in each state and for the United States nationally. Cancer 2017;123:4982-93. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  3. Statistical quality management

    NARCIS (Netherlands)

    Laan, van der P.

    1992-01-01

    Some general remarks are made about statistical quality control. Total Quality Management is briefly discussed. Lecture delivered on 21 October 1992 to members of GEWIS, the student society for Mathematics and Computer Science.

  4. Statistical quality management using miniTAB 14

    International Nuclear Information System (INIS)

    An, Seong Jin

    2007-01-01

    This book explains statistical quality management, covering the definition of quality, quality management, quality cost, basic methods of quality management, principles of control charts, control charts for variables, control charts for attributes, capability analysis, other issues of statistical process control, acceptance sampling, acceptance sampling by variables, design and analysis of experiments, Taguchi quality engineering, response surface methodology, and reliability analysis.

  5. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, confronting high-dimensional data sets has become a normal occurrence. As in many areas of industrial statistics, this brings various challenges to statistical process control (SPC) and monitoring, whose aim is to identify an "out-of-control" state of a process using control charts, in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling's T2. For high-dimensional data with an excessive amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
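    The monitoring scheme described above (Hotelling's T2 computed on a few principal components) can be sketched as below. This is a minimal textbook version on simulated in-control data, not a production implementation, and the choice of three components is arbitrary.

```python
# Hotelling's T^2 on PCA scores of simulated multivariate process data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # in-control reference observations
mu = X.mean(axis=0)
Xc = X - mu
k = 3                                     # number of retained components
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
score_var = s[:k] ** 2 / (len(X) - 1)     # variance of each retained score

def t2(x_new):
    """Hotelling's T^2 of a new observation in the k-dim score space."""
    t = (x_new - mu) @ Vt[:k].T
    return float(np.sum(t ** 2 / score_var))

t2_value = t2(X[0])                       # compared against a control limit
```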

  6. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of knock control for a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. The control algorithm used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. The confidence interval method is used as the basis for adaptation. A simple statistical model which includes generation of the amplitude signals, threshold value determination, and a knock sound model is developed for evaluation of the control concept.

  7. A method for evaluating treatment quality using in vivo EPID dosimetry and statistical process control in radiation therapy.

    Science.gov (United States)

    Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H

    2017-03-13

    Purpose Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that to date generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance, based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality based on three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices less than 1, indicating the potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre. The sample used to generate the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the
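    The capability evaluation mentioned above can be illustrated with a one-sided index, since a pass-rate has only a lower limit. The pass-rate values below are hypothetical; only the 62.5 per cent lower limit is taken from the IMRT result reported in the abstract.

```python
# One-sided process capability index for a chi pass-rate with a lower limit.
import statistics

def cpk_lower(values, lsl):
    """(mean - LSL) / (3 * s): values above 1 suggest a capable process."""
    mean = statistics.fmean(values)
    s = statistics.stdev(values)
    return (mean - lsl) / (3.0 * s)

pass_rates = [92.0, 90.5, 93.1, 91.2, 89.8, 92.4]  # invented % pass-rates
index = cpk_lower(pass_rates, lsl=62.5)
capable = index >= 1.0
```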

  8. SU-C-BRD-01: A Statistical Modeling Method for Quality Control of Intensity- Modulated Radiation Therapy Planning

    International Nuclear Information System (INIS)

    Gao, S; Meyer, R; Shi, L; D'Souza, W; Zhang, H

    2014-01-01

    Purpose: To apply a statistical modeling approach, threshold modeling (TM), for quality control of intensity-modulated radiation therapy (IMRT) treatment plans. Methods: A quantitative measure, the weighted sum of violations of dose/dose-volume constraints, was first developed to represent the quality of each IMRT plan. The threshold modeling approach, an extension of extreme value theory in statistics and an effective way to model extreme values, was then applied to analyze the quality of the plans as summarized by our quantitative measure. Our approach modeled the plans generated by planners as a series of independent and identically distributed random variables and described their behavior when plan quality was controlled below a certain threshold. We tested our approach retrospectively with five locally advanced head and neck cancer patients. Two statistics were incorporated for numerical analysis: the probability of quality improvement (PQI) of the plans and the expected amount of improvement in the quantitative measure (EQI). Results: After clinical planners generated 15 plans for each patient, we applied our approach to obtain the PQI and EQI as if planners were to generate an additional 15 plans. For two of the patients, the PQI was significantly higher than for the other three (0.17 and 0.18 compared to 0.08, 0.01 and 0.01). The actual percentage of the additional 15 plans that outperformed the best of the initial 15 plans was 20% and 27%, compared to 11%, 0% and 0%. EQI for the two potential patients was 34.5 and 32.9, and for the remaining three patients 9.9, 1.4 and 6.6. The actual improvements obtained were 28.3 and 20.5, compared to 6.2, 0 and 0. Conclusion: TM is capable of reliably identifying the potential quality improvement of IMRT plans. It provides clinicians an effective tool to assess the trade-off between extra planning effort and achievable plan quality. This work was supported in part by NIH/NCI grant CA130814.
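    The "probability of quality improvement" statistic has a simple baseline under an i.i.d. assumption: the chance that the best of all n + m plans lies among the m additional plans is m/(n + m). The toy sketch below shows only that exchangeability baseline, not the authors' threshold model, which conditions on quality staying below a threshold.

```python
# Baseline PQI under exchangeability: P(best plan is among the m new plans).
def prob_quality_improvement(n_initial, n_additional):
    """For i.i.d. plan scores, the best plan is equally likely to be any."""
    return n_additional / (n_initial + n_additional)

pqi = prob_quality_improvement(15, 15)  # 15 initial plans, 15 more
```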

  9. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor the precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses; (2) statistical criteria based on short-term method performance are almost an order of magnitude more stringent than the objective criteria and are difficult to satisfy following the same routine laboratory procedures that satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month period, or to determine within-sample variation by one-way analysis of variance of several months of replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
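    The suggested alternative above, a one-way analysis of variance over several months of replicate control-sample results, can be sketched as follows. The monthly replicate values are invented for illustration.

```python
# One-way ANOVA: separates between-month from within-month variation.
def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of replicate groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, k - 1, n - k

months = [[10.1, 9.9, 10.0], [10.3, 10.4, 10.2], [9.8, 9.9, 9.7]]
f_stat, df_between, df_within = one_way_anova(months)
```

A large F relative to the F-distribution critical value for (df_between, df_within) would indicate month-to-month drift beyond the within-month replicate variation.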

  10. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    Science.gov (United States)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool for ensuring that a company can succeed in its business. In order to survive in the global market, with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential for improving business performance. Results relating TQM to business performance are consistent. However, only a few previous studies have examined the mediating effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediating effect of SPC, using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The findings on the significance of the mediator between TQM practices and business performance showed that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediating effect.
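    The mediation result above rests on the product-of-coefficients idea: the indirect effect is a*b, where a is the TQM-to-mediator slope and b the mediator-to-performance slope. The sketch below uses a simplified bivariate version with synthetic scores; a full analysis would adjust b for TQM, and the study itself used structural equation modelling.

```python
# Product-of-coefficients indirect effect with synthetic scores.
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return num / sum((xi - mx) ** 2 for xi in x)

tqm = [1.0, 2.0, 3.0, 4.0, 5.0]
spc = [1.1, 2.0, 2.9, 4.2, 5.0]    # mediator tracks TQM
perf = [0.9, 2.1, 3.1, 3.9, 5.1]   # outcome tracks the mediator
indirect_effect = slope(tqm, spc) * slope(spc, perf)  # a * b
```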

  11. Statistical methods for quality assurance

    International Nuclear Information System (INIS)

    Rinne, H.; Mittag, H.J.

    1989-01-01

    This is the first German-language textbook on quality assurance and the fundamental statistical methods that is suitable for private study. The material for this book has been developed from a course of the Hagen Open University and is characterized by a particularly careful didactical design, achieved and supported by numerous illustrations and photographs, more than 100 exercises with complete solutions, many fully worked calculation examples, surveys fostering a comprehensive approach, and an annotated bibliography. The textbook has an eye to practice and applications, and great care has been taken by the authors to avoid abstraction wherever appropriate, to explain the proper conditions for applying the testing methods described, and to give guidance on suitable interpretation of results. The testing methods explained also include the latest developments and research results, in order to foster their adoption in practice. (orig.) [de]

  12. Statistical Control of Measurement Quality; Controle Statistique de la Qualite de la Mesure; Statisticheskim kontrol' kachestva izmerenij; Control Estadistico de la Calidad de las Mediciones

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C. A. [Battelle Memorial Institute, Richland, WA (United States)

    1966-02-15

    Effective nuclear materials management, and hence the design and operation of associated material control systems, depends heavily on the quality of the quantitative data on which it is based. Information concerning the reliability of the measurement methods employed is essential both to the determination of data requirements and to the evaluation of results obtained. Any method of analysis should be (1) relatively free from bias and (2) reproducible or, in more usual terminology, precise. Many statistical techniques are available to evaluate and control the reproducibility of analytical results. Economical and effective experimental designs have been developed for the segregation of different sources of measurement error. Procedures have been developed or adapted for use in maintaining and controlling the precision of routine measurements. All of these techniques require that at least some measurements be duplicated, but duplication of all measurements can be justified only when the detection of every gross error, or mistake, is extremely important. Three types of measurement bias can be considered: (1) bias relative to a standard, (2) bias relative to prior experience, and (3) bias relative to a group. The first refers to the degree to which the measurements obtained deviate systematically from some ''standard'' which is unbiased either (1) by definition, or (2) because all known sources of bias have been removed. The second is concerned with the presence of systematic differences over a period of time. The third type of bias concerns the relationship between different physical entities or individuals at a given time. Recent developments in statistical methodology applicable to the evaluation of all three types of bias are discussed. Examples of the use of the statistical techniques discussed on Hanford data are presented.
(author)

  13. Quality control

    International Nuclear Information System (INIS)

    Skujina, A.; Purina, S.; Riekstina, D.

    1999-01-01

    Optimal objects for environmental monitoring (soils, spruce needles, and bracken ferns) were identified in regions of possible radioactive contamination near the Salaspils nuclear reactor and the Ignalina nuclear power plant. The determination of Sr-90 was based on the radiochemical separation of Sr-90 (=Y-90) by HDEHP extraction and counting of the Cerenkov radiation. Quality control of the results was carried out. (authors)

  14. Statistical process control for serially correlated data

    NARCIS (Netherlands)

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice

  15. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  16. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitati

  17. From Quality to Information Quality in Official Statistics

    Directory of Open Access Journals (Sweden)

    Kenett Ron S.

    2016-12-01

    Full Text Available The term quality of statistical data, developed and used in official statistics and international organizations such as the International Monetary Fund (IMF) and the Organisation for Economic Co-operation and Development (OECD), refers to the usefulness of summary statistics generated by producers of official statistics. Similarly, in the context of survey quality, official agencies such as Eurostat, the National Center for Science and Engineering Statistics (NCSES), and Statistics Canada have created dimensions for evaluating the quality of a survey and its ability to report ‘accurate survey data’.

  18. Statistical process control and verifying positional accuracy of a cobra motion couch using step-wedge quality assurance tool.

    Science.gov (United States)

    Binny, Diana; Lancaster, Craig M; Trapp, Jamie V; Crowe, Scott B

    2017-09-01

    This study utilizes process control techniques to identify action limits for TomoTherapy couch positioning quality assurance tests. A test was introduced to monitor accuracy of the applied couch offset detection in the TomoTherapy Hi-Art treatment system using the TQA "Step-Wedge Helical" module and MVCT detector. Individual X-charts and process capability (cp), probability (P), and acceptability (cpk) indices were used to monitor 4 years of couch IEC offset data to detect systematic and random errors in the couch positional accuracy for different action levels. Process capability tests were also performed on the retrospective data to define tolerances based on user-specified levels. A second study was carried out whereby physical couch offsets were applied using the TQA module and the MVCT detector was used to detect the observed variations. Random and systematic variations were observed for the SPC-based upper and lower control limits, and investigations were carried out to maintain the ongoing stability of the process over a 4-year period and over three-monthly periods. Local trend analysis showed mean variations up to ±0.5 mm in the three-monthly analysis period for all IEC offset measurements. Variations were also observed in the detected versus applied offsets using the MVCT detector in the second study, largely in the vertical direction, and actions were taken to remediate this error. Based on the results, it was recommended that imaging shifts in each coordinate direction only be applied after assessing the machine for applied versus detected test results using the step helical module. User-specified tolerance levels of at least ±2 mm were recommended for a test frequency of once every 3 months to improve couch positional accuracy. SPC enables detection of systematic variations prior to reaching machine tolerance levels. Couch encoding system recalibrations reduced variations to user-specified levels and a monitoring period of 3 months using SPC facilitated in detecting
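
    The individuals-chart limits and capability indices mentioned above can be sketched in Python. This is a minimal illustration with made-up offset data, not the study's actual analysis; `usl`/`lsl` are hypothetical specification limits, and 1.128 is the standard d2 constant for moving ranges of size 2:

    ```python
    def individuals_chart(x, usl, lsl):
        """Individuals (X) chart limits via the average moving range,
        plus Cp/Cpk capability indices for specification limits usl/lsl."""
        n = len(x)
        xbar = sum(x) / n
        mr = [abs(x[i] - x[i - 1]) for i in range(1, n)]
        sigma = (sum(mr) / len(mr)) / 1.128   # d2 for moving ranges of size 2
        ucl, lcl = xbar + 3 * sigma, xbar - 3 * sigma
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - xbar, xbar - lsl) / (3 * sigma)
        return xbar, lcl, ucl, cp, cpk
    ```

    Points outside (lcl, ucl) would trigger investigation of the positioning process; cp or cpk below roughly 1.33 is conventionally read as an incapable process.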

  19. [Statistical approach to evaluate the occurrence of out-of acceptable ranges and accuracy for antimicrobial susceptibility tests in inter-laboratory quality control program].

    Science.gov (United States)

    Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa

    2013-03-01

    To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and of inhibitory zone diameter for disk diffusion tests on the basis of Clinical and Laboratory Standards Institute (CLSI) M100-S21. In the analysis, more than two out-of-acceptable-range results in the 20 tests were considered not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more occurrences of out-of-acceptable-range results. Then, a binomial test was applied to each participating laboratory. The results indicated that the occurrences of out-of-acceptable-range results in 11 laboratories were significantly higher when compared to the CLSI recommendation (allowable rate). The mean deviation of results in each laboratory was then statistically compared with zero using a Student's t-test. The results revealed that 5 of the 11 above laboratories reported erroneous test results that systematically drifted to the side of resistance. In conclusion, our statistical approach has enabled us to detect significantly higher occurrences and sources of interpretive errors in antimicrobial susceptibility tests; therefore, this approach can provide us with additional information that can improve the accuracy of the test results in clinical microbiology laboratories.
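
    The binomial screening step described above can be sketched as a one-sided exact test. This is an illustrative reconstruction, not the authors' exact procedure; the counts and the 5% allowable rate below are hypothetical:

    ```python
    from math import comb

    def binom_upper_p(k, n, p0):
        """Exact one-sided binomial p-value: P(X >= k) for X ~ Bin(n, p0).
        A small p-value flags a laboratory whose number of out-of-range
        results is implausibly high under the allowable rate p0."""
        return sum(comb(n, i) * p0**i * (1 - p0) ** (n - i) for i in range(k, n + 1))

    # e.g. 4 out-of-acceptable-range results in 20 tests, allowable rate 5%
    p_value = binom_upper_p(4, 20, 0.05)   # ≈ 0.016, so this laboratory is flagged
    ```

    The exact test is preferable to a normal approximation here because n = 20 and p0 = 0.05 give an expected count of only 1.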

  20. Use of statistic control of the process as part of a quality assurance plan; Empleo del control estadistico de proceso como parte de un plan de aseguramiento de la calidad

    Energy Technology Data Exchange (ETDEWEB)

    Acosta, S.; Lewis, C., E-mail: sacosta@am.gob.ar [Autoridad Regulatoria Nuclear (ARN), Buenos Aires (Argentina)

    2013-07-01

    One of the technical requirements of the IRAM ISO 17025 standard for the accreditation of testing laboratories is assurance of the quality of results through control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory for environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified by means of a statistical process control chart. The present work concerns the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to take into account significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of a statistical process control. The data obtained yielded the lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan.
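
    The blank control chart described above reduces to a centre line with ±3s limits computed from historical blank measurements. A minimal sketch, with hypothetical count data rather than the laboratory's own:

    ```python
    import statistics

    def blank_limits(blanks, k=3.0):
        """Centre line and +/- k*s control limits for a blank control chart,
        estimated from historical blank measurements."""
        m = statistics.mean(blanks)
        s = statistics.stdev(blanks)
        return m - k * s, m, m + k * s

    def in_control(value, lcl, ucl):
        """A new blank determination is acceptable if it falls inside the limits."""
        return lcl <= value <= ucl
    ```

    In practice the historical data would be screened for out-of-control points (and stratified by conditions such as temperature and humidity) before the limits are fixed.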

  1. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  2. Memory-type control charts in statistical process control

    NARCIS (Netherlands)

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements of a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark the bounds of expected variation. Once
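
    A memory-type chart such as the EWMA weights past observations as well as the current one, which makes it sensitive to small sustained shifts. A minimal sketch (λ = 0.2 and L = 3 are common textbook choices; `mu0` and `sigma` are assumed in-control values, not estimates from the thesis):

    ```python
    import math

    def ewma_chart(data, lam=0.2, L=3.0, mu0=0.0, sigma=1.0):
        """EWMA statistic z_i = lam*x_i + (1 - lam)*z_{i-1} with
        time-varying control limits mu0 +/- L*sigma_z(i)."""
        z = mu0
        points = []
        for i, x in enumerate(data, start=1):
            z = lam * x + (1 - lam) * z
            # exact variance of the EWMA statistic at time i
            var = sigma**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i))
            half = L * math.sqrt(var)
            points.append((z, mu0 - half, mu0 + half))
        return points
    ```

    The limits widen toward their asymptotic value as i grows; a point lands out of control when z leaves the interval.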

  3. A statistical rationale for establishing process quality control limits using fixed sample size, for critical current verification of SSC superconducting wire

    International Nuclear Information System (INIS)

    Pollock, D.A.; Brown, G.; Capone, D.W. II; Christopherson, D.; Seuntjens, J.M.; Woltz, J.

    1992-01-01

    This work has demonstrated the statistical concepts behind the XBAR R method for determining sample limits to verify billet Ic performance and process uniformity. Using a preliminary population estimate for μ and σ from a stable production lot of only 5 billets, we have shown that reasonable sensitivity to systematic process drift and random within-billet variation may be achieved by using per-billet subgroup sizes of moderate proportions. The effects of subgroup size (n) and sampling risk (α and β) on the calculated control limits have been shown to be important factors that need to be carefully considered when selecting the actual number of measurements to be used per billet, for each supplier process. Given the present method of testing, in which individual wire samples are ramped to Ic only once, with measurement uncertainty due to repeatability and reproducibility (typically > 1.4%), large subgroups (i.e. >30 per billet) appear to be unnecessary, except as an inspection tool to confirm wire process history for each spool. The introduction of the XBAR R method or a similar Statistical Quality Control procedure is recommended for use in the superconducting wire production program, particularly when the program transitions from requiring tests for all pieces of wire to sampling each production unit.
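
    The XBAR R scheme referred to above can be sketched generically as follows. The subgroup data stand in for hypothetical per-billet Ic measurements (not the authors' data), and A2/D3/D4 are the standard Shewhart constants for small subgroups:

    ```python
    # Standard Shewhart control-chart constants for subgroup size n
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}
    D3 = {2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0}
    D4 = {2: 3.267, 3: 2.574, 4: 2.282, 5: 2.114}

    def xbar_r_limits(subgroups):
        """X-bar and R chart control limits from rational subgroups
        (e.g. wire samples per billet); subgroup size must be 2-5 here."""
        n = len(subgroups[0])
        xbars = [sum(g) / n for g in subgroups]
        ranges = [max(g) - min(g) for g in subgroups]
        xbarbar = sum(xbars) / len(xbars)
        rbar = sum(ranges) / len(ranges)
        return {
            "xbar": (xbarbar - A2[n] * rbar, xbarbar + A2[n] * rbar),
            "R": (D3[n] * rbar, D4[n] * rbar),
        }
    ```

    The X-bar chart flags systematic drift between billets, while the R chart flags excessive within-billet variation, mirroring the two failure modes the abstract describes.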

  4. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the quality-control documentation system of the ISO 9000 series) – applied to various processes, products and services, belong among the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, on that basis, to propose suitable interventions aimed at improving these processes, products and services. The contribution presents the theoretical basis and applicability of the principles of: diagnostics of cause and effect; Pareto analysis and the Lorenz curve; number distributions and frequency curves of random-variable distributions; and Shewhart control charts.
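
    Of the methods listed, Pareto analysis is the easiest to sketch in code: rank the causes of defects by frequency and accumulate percentages to find the "vital few". The defect categories below are invented for illustration:

    ```python
    def pareto(counts):
        """Sort causes by frequency and attach cumulative percentages,
        i.e. the table underlying a Pareto chart; counts is {cause: frequency}."""
        total = sum(counts.values())
        items = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
        cum = 0
        rows = []
        for cause, n in items:
            cum += n
            rows.append((cause, n, round(100 * cum / total, 1)))
        return rows

    rows = pareto({"positioning": 45, "exposure": 30, "artefact": 15, "other": 10})
    ```

    Reading down the cumulative column shows how few causes account for most defects, which is where intervention effort should go first.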

  5. Statistical elements in calculations procedures for air quality control; Elementi di statistica nelle procedure di calcolo per il controllo della qualita' dell'aria

    Energy Technology Data Exchange (ETDEWEB)

    Mura, M.C. [Istituto Superiore di Sanita' , Laboratorio di Igiene Ambientale, Rome (Italy)

    2001-07-01

    The statistical processing of data from the monitoring of chemical atmospheric pollution, aimed at air quality control, is presented in the form of procedural models, intended to offer a practical working instrument to operators in the sector. The procedural models are modular and can easily be integrated with other models. They include elementary calculation procedures and mathematical methods for statistical analysis. The calculation elements have been developed by probabilistic induction so as to relate them to the statistical models which are the basis of the methods used for the study and forecasting of atmospheric pollution. This report is part of the updating and training activity that the Istituto Superiore di Sanita' has been carrying on for over twenty years, addressed to operators in the environmental field.

  6. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    Science.gov (United States)

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
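
    The capability index Cpm used in the study penalises both spread and off-target centring, unlike Cp. A minimal sketch of the usual Taguchi definition, with invented measurement data and hypothetical specification limits:

    ```python
    import math

    def cpm(data, lsl, usl, target):
        """Taguchi capability index: Cpm = (USL - LSL) / (6 * tau),
        where tau^2 = s^2 + (mean - target)^2."""
        n = len(data)
        mean = sum(data) / n
        s2 = sum((x - mean) ** 2 for x in data) / (n - 1)
        tau = math.sqrt(s2 + (mean - target) ** 2)
        return (usl - lsl) / (6 * tau)
    ```

    A Cpm above roughly 1.33, the threshold quoted in the abstract, indicates the process comfortably produces results within its specifications.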

  7. Radiographic rejection index using statistical process control

    International Nuclear Information System (INIS)

    Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.

    2015-01-01

    The Repeat Analysis Index (IRR) is one of the items in the Quality Control Program required by Brazilian radiological-protection law and should be determined frequently, at least every six months. In order to extract more and better information from the IRR, this study applies Statistical Quality Control to the reject rate, through statistical process control (a control chart for attributes, p – GC) and the Pareto chart (GP). Data were collected for 9 months, with daily collection during the last four months. Control limits (LC) were established and the Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analysed. Relevant information, such as service orders for the X-ray equipment and processors, was cross-referenced to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The control chart demonstrated an ability to predict equipment failures, and the Pareto chart showed clearly which causes recur in the IRR. (authors) [pt
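
    The attributes chart for a reject rate is a p chart: limits follow from the binomial standard error of the pooled proportion. A generic sketch with hypothetical daily reject counts (per-point limits handle varying daily workloads):

    ```python
    import math

    def p_chart_limits(rejected, totals):
        """p-chart limits for a reject rate: centre line pbar with
        3-sigma binomial limits computed per sample size."""
        pbar = sum(rejected) / sum(totals)
        limits = []
        for n in totals:
            half = 3 * math.sqrt(pbar * (1 - pbar) / n)
            limits.append((max(0.0, pbar - half), min(1.0, pbar + half)))
        return pbar, limits
    ```

    A day whose observed reject fraction exceeds its upper limit would prompt the kind of equipment cross-check the abstract describes.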

  8. Statistical tests applied as quality control measures to leaching of nuclear waste glasses and in the evaluation of the leach vessel

    International Nuclear Information System (INIS)

    Bokelund, H.; Deelstra, K.

    1988-01-01

    Simple statistical tests, such as regression analysis and analysis of variance, have been applied to data obtained from leaching experiments carried out under various conditions of time and temperature. The precision and the accuracy of the overall leaching procedure were evaluated, considering the short-term within-laboratory effects. The data originated from determinations of the mass losses of leached glass specimens and from measurements of the electrical conductivity and the pH of the leachants. The solution conductivity correlates highly with the normalized mass loss; hence it provides a consistency check on the measurements of the latter parameter. The overall relative precision of the leaching test method was found to be 5-12%, including the effects caused by inhomogeneity of the glass specimens. The conditions for the application of the Teflon inserts often used in leaching devices have been investigated; a modified cleaning procedure is proposed to ascertain the absence of systematic errors in their repeated utilization (quality control). The operational limit of 190 °C, as specified by the Materials Characterization Center, Richland, USA, was confirmed experimentally. 8 refs.; 1 figure; 8 tabs

  9. Using Statistical Process Control to Enhance Student Progression

    Science.gov (United States)

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  10. Multivariate Statistical Process Control Charts: An Overview

    OpenAIRE

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...
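
    The multivariate Shewhart-type charts reviewed above monitor the Hotelling T² statistic. A minimal dependency-free sketch for one observation, assuming the in-control mean vector and the inverse covariance matrix have already been estimated elsewhere:

    ```python
    def t2_statistic(x, mean, cov_inv):
        """Hotelling T^2 for one observation: (x - m)' S^-1 (x - m).
        cov_inv is the precomputed inverse of the covariance matrix."""
        d = [xi - mi for xi, mi in zip(x, mean)]
        p = len(d)
        y = [sum(cov_inv[i][j] * d[j] for j in range(p)) for i in range(p)]
        return sum(di * yi for di, yi in zip(d, y))
    ```

    A point signals when T² exceeds a control limit derived from the F (or chi-squared) distribution; the single statistic replaces p separate univariate charts while accounting for correlation between the variables.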

  11. SPECT quality control tests

    International Nuclear Information System (INIS)

    Robilotta, C.C.; Rebelo, M.F.S.; Oliveira, M.A.; Abe, R.

    1987-01-01

    Quality control tests of a tomographic system composed of a rotating camera (CGR Gammatomome T-9000) and a microcomputer are presented. Traditional quality control tests for scintillation cameras and specific tests for tomographic systems are reported. (M.A.C.) [pt

  12. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.)

    International Nuclear Information System (INIS)

    Gerard, K.

    2008-11-01

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered as a technique of choice to increase the local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of the treatment preparation, two axes of work have been defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm in the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with cancer of the oropharynx showed that the quality of the plan is also increased compared to a manual beam selection. Indeed, for an equivalent or even a better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results performed using an ionisation chamber at Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment security using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)

  13. Successfully reducing newborn asphyxia in the labour unit in a large academic medical centre: a quality improvement project using statistical process control.

    Science.gov (United States)

    Hollesen, Rikke von Benzon; Johansen, Rie Laurine Rosenthal; Rørbye, Christina; Munk, Louise; Barker, Pierre; Kjaerbye-Thygesen, Anette

    2018-02-03

    A safe delivery is part of a good start in life, and a continuous focus on preventing harm during delivery is crucial, even in settings with a good safety record. In January 2013, the labour unit at Copenhagen University Hospital, Hvidovre, undertook a quality improvement (QI) project to prevent asphyxia and reduced the percentage of newborns with asphyxia by 48%. The change theory consisted of two primary elements: (1) the clinical content, including three clinical bundles of evidence-based care, a 'delivery bundle', an 'oxytocin bundle' and a 'vacuum extraction bundle'; (2) an implementation theory, including improving skills in interpretation of cardiotocography, use of QI methods and participation in a national learning network. The Model for Improvement and Deming's system of profound knowledge were used as a methodological framework. Data on compliance with the care bundles and the number of deliveries between newborns with asphyxia (Apgar <7 after 5 min) were analysed using statistical process control. Compliance with all three clinical care bundles improved to 95% or more, and the percentages of newborns with pH <7 and Apgar <7 after 5 min were reduced by 48% and 31%, respectively. In general, the QI approach strengthened multidisciplinary teamwork, systematised workflow and structured communication around the deliveries. Changes included making a standard memo in the medical record, the use of a bedside whiteboard, bedside handovers, shared decisions with a peer when using an oxytocin infusion and the use of a checklist before vacuum extractions. This QI project illustrates how aspects of patient safety, such as the prevention of asphyxia, can be improved using QI methods to more reliably implement best practice, even in high-performing systems. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) of electron beam monitoring in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data was under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. Quality assurance and quality control

    International Nuclear Information System (INIS)

    Kaden, W.

    1986-01-01

    General preconditions and methods for QA work in the nuclear field are analysed. The application of general QA principles to actual situations is illustrated by examples in the fields of engineering and of the manufacturing of mechanical and electrical components. All QA measures must be fitted to the complexity and relevance of the work steps, which are under consideration. The key to good product quality is the control of working processes. The term 'controlled process' is discussed in detail and examples of feed back systems are given. The main QA measures for the operation of nuclear power plants include the establishment of a Quality Assurance Program, training and qualification of personnel, procurement control, inspection and tests, reviews and audits. These activities are discussed. (orig.)

  16. Quality assurance and quality control

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    The practice of nuclear diagnostic imaging requires an appropriate quality assurance program to attain high standards of efficiency and reliability. The International Atomic Energy Agency defines the term quality assurance as ''the closeness with which the outcome of a given procedure approaches some ideal, free from all errors and artifacts.'' The term quality control is used in reference to the specific measures taken to ensure that one particular aspect of the procedure is satisfactory. Therefore, quality assurance is a hospital-wide concept that should involve all aspects of clinical practice. Quality control is concerned with the submission of requests for procedures; the scheduling of patients; the preparation and dispensing of radiopharmaceuticals; the protection of patients, staff, and the general public against radiation hazards and accidents caused by radioactive materials or by faulty equipment; the setting up, use, and maintenance of electronic instruments; the methodology of the actual procedures; the analysis and interpretation of data; the reporting of results; and, finally, the keeping of records. The chapter discusses each of these areas

  17. Quality control in radiotherapy

    International Nuclear Information System (INIS)

    Batalla, A.

    2009-01-01

    The authors discuss the modalities and periodicities of internal quality control on radiotherapy installations. They indicate the different concerned systems and the aspects and items to be controlled (patient and personnel security, apparatus mechanical characteristics, beam quality, image quality, isodose and irradiation duration calculation, data transfer). They present the measurement instruments and tools used for the mechanical controls, dose measurement, beam homogeneity and symmetry, anatomic data acquisition systems, and dose distribution and control imagery calculation

  18. Quality Control Applications

    CERN Document Server

    Chorafas, Dimitris N

    2013-01-01

    Quality control is a constant priority in electrical, mechanical, aeronautical, and nuclear engineering – as well as in the vast domain of electronics, from home appliances to computers and telecommunications. Quality Control Applications provides guidance and valuable insight into quality control policies; their methods, their implementation, constant observation and associated technical audits. What has previously been a mostly mathematical topic is translated here for engineers concerned with the practical implementation of quality control. Once the fundamentals of quality control are established, Quality Control Applications goes on to develop this knowledge and explain how to apply it in the most effective way. Techniques are described and supported using relevant, real-life, case studies to provide detail and clarity for those without a mathematical background. Among the many practical examples, two case studies dramatize the importance of quality assurance: A shot-by-shot analysis of the errors made ...

  19. Complexity control in statistical learning

    Indian Academy of Sciences (India)

    Then we describe how the method of regularization is used to control complexity in learning. We discuss two examples of regularization, one in which the function space used is finite dimensional, and another in which it is a reproducing kernel Hilbert space. Our exposition follows the formulation of Cucker and Smale.

  20. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  1. Checking quality control?

    DEFF Research Database (Denmark)

    Brodersen, Lars

    2005-01-01

    How is quality control doing within the community of GIS, web-services based on geo-information, GI etc.?

  2. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  3. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms

  4. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust

  5. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  6. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  7. A statistical rationale for establishing process quality control limits using fixed sample size, for critical current verification of SSC superconducting wire

    International Nuclear Information System (INIS)

    Pollock, D.A.; Brown, G.; Capone, D.W. II; Christopherson, D.; Seuntjens, J.M.; Woltz, J.

    1992-03-01

The purpose of this paper is to demonstrate a statistical method for verifying superconducting wire process stability as represented by the critical current I_c. The paper does not propose changing the I_c testing frequency for wire during Phase 1 of the present Vendor Qualification Program. The actual statistical limits demonstrated for one supplier's data are not expected to be suitable for all suppliers. However, the method used to develop the limits, and the potential for improved process control through their use, may be applied equally. Implementing the demonstrated method implies that the current practice of testing all pieces of wire from each billet, for the purpose of detecting manufacturing process errors (i.e. missing a heat-treatment cycle for part of the billet, etc.), can be replaced by other less costly process control measures. As used in this paper, process control limits for critical current are quantitative indicators of the source manufacturing process uniformity. The limits serve as alarms indicating the need for manufacturing process investigation.
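The fixed-sample-size control-limit scheme described in this abstract can be sketched in a few lines (a minimal illustration, not the authors' actual procedure; the I_c values and the 3-sigma choice are hypothetical):

```python
import statistics

def control_limits(reference_sample, k=3.0):
    """Fixed-sample-size control limits: mean +/- k sample standard
    deviations, estimated from a reference set of measurements."""
    mean = statistics.mean(reference_sample)
    sd = statistics.stdev(reference_sample)
    return mean - k * sd, mean + k * sd

def needs_investigation(value, lcl, ucl):
    """Alarm when a new measurement falls outside the control limits."""
    return not (lcl <= value <= ucl)

# Hypothetical critical-current (I_c, amperes) measurements from one supplier
ic_sample = [272, 268, 275, 270, 271, 269, 274, 273, 270, 272]
lcl, ucl = control_limits(ic_sample)
```

A billet whose I_c falls outside `(lcl, ucl)` would trigger a manufacturing process investigation rather than a wholesale retest of every wire piece.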

  8. Quality control of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Verdera, E.S.

    1994-01-01

The quality control of radiopharmaceuticals is based on physical, physicochemical and biological controls. Among the different controls, the following can be enumerated: visual aspect, size and number of particles, activity, purity, pH, isotonicity, sterility, radioimmunoassay, toxicity, stability and clinical assay

  9. Quality Control in construction.

    Science.gov (United States)

    1984-01-01

The founder of quality circles, Dr. Kaoru Ishikawa, was an engineering professor at Tokyo University. In 1962 he gave shape to a form of training which featured intradepartmental groups of ten or so workers seated around a table; hence the name Quality Control Circle. The Japanese ingredients of quality, and the circles themselves, bear closer scrutiny.

  10. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  11. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  12. SAQC: SNP Array Quality Control

    Directory of Open Access Journals (Sweden)

    Li Ling-Hui

    2011-04-01

Full Text Available Abstract Background Genome-wide single-nucleotide polymorphism (SNP) arrays containing hundreds of thousands of SNPs from the human genome have proven useful for studying important human genome questions. Data quality of SNP arrays plays a key role in the accuracy and precision of downstream data analyses. However, good indices for assessing data quality of SNP arrays have not yet been developed. Results We developed new quality indices to measure the quality of SNP arrays and/or DNA samples and investigated their statistical properties. The indices quantify a departure of estimated individual-level allele frequencies (AFs) from expected frequencies via standardized distances. The proposed quality indices followed lognormal distributions in several large genomic studies that we empirically evaluated. AF reference data and quality index reference data for different SNP array platforms were established based on samples from various reference populations. Furthermore, a confidence interval method based on the underlying empirical distributions of quality indices was developed to identify poor-quality SNP arrays and/or DNA samples. Analyses of authentic biological data and simulated data show that this new method is sensitive and specific for the detection of poor-quality SNP arrays and/or DNA samples. Conclusions This study introduces new quality indices, establishes references for AFs and quality indices, and develops a detection method for poor-quality SNP arrays and/or DNA samples. We have developed a new computer program that utilizes these methods called SNP Array Quality Control (SAQC). SAQC software is written in R and R-GUI and was developed as a user-friendly tool for the visualization and evaluation of data quality of genome-wide SNP arrays. The program is available online (http://www.stat.sinica.edu.tw/hsinchou/genetics/quality/SAQC.htm).
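The standardized-distance idea behind the quality indices can be illustrated with a toy computation (a simplified sketch only; SAQC's actual index definitions, reference data and R implementation differ, and the thresholds below are hypothetical):

```python
import math

def quality_index(observed_af, reference_af):
    """Standardized distance between an individual's estimated allele
    frequencies and population reference frequencies (simplified sketch;
    SAQC's actual index definition may differ)."""
    assert len(observed_af) == len(reference_af)
    d = sum((o - r) ** 2 for o, r in zip(observed_af, reference_af))
    return math.sqrt(d / len(observed_af))

def is_poor_quality(index, log_mean, log_sd, z=1.96):
    """Flag an array whose log-index exceeds the upper bound of a
    confidence interval under an assumed lognormal reference distribution."""
    return math.log(index) > log_mean + z * log_sd
```

An array with a large departure from reference AFs would yield a large index and be flagged against the empirical lognormal reference.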

  13. An introduction to statistical process control in research proteomics.

    Science.gov (United States)

    Bramwell, David

    2013-12-16

Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier.
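Control rules of the kind described as "easy to derive and implement" can be sketched as simple checks on a series of QC-sample measurements (a hedged illustration in the spirit of Westgard-style multirules, not code from the article; the target mean and SD would come from characterising the measurement system):

```python
def rule_1_2s(values, mean, sd):
    """Warning rule: any single control value beyond 2 SD of the target."""
    return any(abs(v - mean) > 2 * sd for v in values)

def rule_2_2s(values, mean, sd):
    """Rejection rule: two consecutive values beyond 2 SD on the same side."""
    for a, b in zip(values, values[1:]):
        if (a - mean) > 2 * sd and (b - mean) > 2 * sd:
            return True
        if (mean - a) > 2 * sd and (mean - b) > 2 * sd:
            return True
    return False
```

Runs that trip the rejection rule would be investigated before their results are trusted, mirroring clinical-chemistry practice.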

  14. Quality control of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Kristensen, K.

    1981-01-01

    Quality assurance was introduced in the pharmaceutical field long before it was used in many other areas, and the term quality control has been used in a much broader sense than merely analytical quality control. The term Good Manufacturing Practice (GMP) has been used to describe the system used for producing safe and effective drugs of a uniform quality. GMP has also been used for the industrial production of radiopharmaceuticals. For the preparation and control of radiopharmaceuticals in hospitals a similar system has been named Good Radiopharmacy Practice (GRP). It contains the same elements as GMP but takes into account the special nature of this group of drugs. Data on the assessment of the quality of radiopharmaceuticals in relation to present standards are reviewed. The general conclusion is that the quality of radiopharmaceuticals appears comparable to that of other drugs. It seems possible to establish the production of radiopharmaceuticals, generators and preparation kits in such a way that analytical control of the final product at the hospital may be limited provided the final preparation work is carried out in accordance with GRP principles. The elements of GRP are reviewed. (author)

  15. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  16. Quality assurance and applied statistics. Method 3

    International Nuclear Information System (INIS)

    1992-01-01

This German Industry Standards paperback contains the International Standards of the ISO 9000 series (or, as the case may be, the European Standards of the EN 29000 series) concerning quality assurance, including the already completed supplementary guidelines with ISO 9000 and ISO 9004 section numbers, which have been adopted as German Industry Standards and are observed and applied worldwide to a great extent. It also includes the German Industry Standards ISO 10011 parts 1, 2 and 3 concerning the auditing of quality-assurance systems, and the German Industry Standard ISO 10012 part 1 concerning quality-assurance requirements (confirmation system) for measuring devices. The standards also include English and French versions. They are applicable independent of the user's line of industry and thus constitute basic standards. (orig.) [de

  17. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    Science.gov (United States)

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We are presenting a new approach of identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields comparable results to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during manufacturing process, especially in cases when historical process data is not straightforwardly available. In the presented case the changes of lactose characteristics are influencing the performance of the extrusion/spheronization process step. The pellet cores produced by using one (considered as less suitable) lactose source were on average larger and more fragile, leading to consequent breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process leading to compromised film coating.

  18. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
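A control chart that separates common-cause from special-cause variation can be sketched as a Shewhart individuals chart (a minimal illustration; the 2.66 factor is the standard moving-range constant for subgroups of size two, and the data are hypothetical):

```python
def individuals_chart(data):
    """Shewhart individuals chart: center line at the mean, control limits
    at +/- 2.66 * mean moving range. Points outside the limits signal
    special-cause variation; points inside reflect common-cause variation."""
    center = sum(data) / len(data)
    moving_ranges = [abs(a - b) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = center + 2.66 * mr_bar
    lcl = center - 2.66 * mr_bar
    special = [i for i, x in enumerate(data) if x > ucl or x < lcl]
    return center, lcl, ucl, special
```

In a time-series design, a point flagged after an intervention is evidence of special-cause variation attributable to the innovation rather than random fluctuation.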

  19. Statistical process control in wine industry using control cards

    OpenAIRE

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

This paper is based on research into the technological process of automatic filling of wine bottles in a winery in Stip, Republic of Macedonia. Statistical process control using statistical control charts is carried out. The results and recommendations for improving the process are discussed.

  20. Michigan's forests, 2004: statistics and quality assurance

    Science.gov (United States)

    Scott A. Pugh; Mark H. Hansen; Gary Brand; Ronald E. McRoberts

    2010-01-01

    The first annual inventory of Michigan's forests was completed in 2004 after 18,916 plots were selected and 10,355 forested plots were visited. This report includes detailed information on forest inventory methods, quality of estimates, and additional tables. An earlier publication presented analyses of the inventoried data (Pugh et al. 2009).

  1. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  2. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  3. Quality control of dosemeters

    International Nuclear Information System (INIS)

    Mendes, L.

    1984-01-01

Nuclear medicine laboratories are required to assay samples of radioactivity to be administered to patients. Almost universally, these assays are accomplished by use of a well ionization chamber isotope calibrator. The Instituto de Radioprotecao e Dosimetria (Institute for Radiological Protection and Dosimetry) of the Comissao Nacional de Energia Nuclear (National Commission for Nuclear Energy) is carrying out a National Quality Control Programme in Nuclear Medicine, supported by the International Atomic Energy Agency. The assessment of the current needs and practices of quality control in the entire country of Brazil includes Dose Calibrators and Scintillation Cameras, but this manual is restricted to the former. Quality Control Procedures for these instruments are described in this document, together with specific recommendations and an assessment of their accuracy. (Author) [pt

  4. Quality Management, Quality Assurance and Quality Control in Blood Establishments

    OpenAIRE

    Bolbate, N

    2008-01-01

Quality terms and the roots of the matter are analyzed according to the European Committee's recommendations. The essence of process and product quality control, as well as that of quality assurance, is described. The structure of the quality system, including quality control, quality assurance and management, is justified in the article.

  5. Quality control in haemostasis.

    Science.gov (United States)

    Capel, P; Chatelain, B; Leclerq, R; Lust, A; Masure, R; Arnout, J

    1992-01-01

Laboratory investigation of the haemostatic system requires particular procedures for the quality control of analytical as well as preanalytical variables. This paper reviews the precautions that have to be taken in blood sampling, the transport of the tubes and the performance of the laboratory tests aimed at investigating the haemostatic system, in order to obtain reliable results.

  6. Ocean Data Quality Control

    Science.gov (United States)

    2011-11-18

the aerosol at the coincident time and location of the satellite SST retrievals. This information is available in the daytime for the anti-solar...are of the same form, such as probabilities or standard normal deviates. A quality control decision-making algorithm in use at the U.S. Navy oceano

  7. VGI QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    C. C. Fonte

    2015-08-01

Full Text Available This paper presents a framework for considering quality control of volunteered geographic information (VGI). Different issues need to be considered during the conception, acquisition and post-acquisition phases of VGI creation. This includes items such as collecting metadata on the volunteer, providing suitable training, giving corrective feedback during the mapping process and the use of control data, among others. Two examples of VGI data collection are then considered with respect to this quality control framework, i.e. VGI data collection by National Mapping Agencies and by the most recent Geo-Wiki tool, a game called Cropland Capture. Although good practices are beginning to emerge, there is still the need for the development and sharing of best practice, especially if VGI is to be integrated with authoritative map products or used for calibration and/or validation of land cover in the future.

  8. Quality control of mammography

    International Nuclear Information System (INIS)

    Hering, K.G.

    1986-01-01

Random checks of mammograms allow quality control to be clearly assessed concerning the correct application and operation of the radiographic system, as indicated by rich contrast in breast tissue images, complete imaging of the mammary parenchyma, freedom from blurs due to motion, efficient breast compression, correct film labelling and perfect maintenance of the film-screen system. In addition to these subjective assessments, the following points should be considered when using objective measurement procedures and phantoms: testing the correct function of X-ray and radiographic equipment by means of test specimens to measure kV, mAs and the automatic exposure timer; comparing dose and density to initial values; and checking film processing by using a sensitometer. Quality assurance handling varies from one KV (KV=Association of German Panel Doctors) to the next. That is why the users need to obtain the guidelines of the respective KV relative to radiological quality assurance and to proceed according to these. (orig.) [de

  9. Quality control of intelligence research

    International Nuclear Information System (INIS)

    Lu Yan; Xin Pingping; Wu Jian

    2014-01-01

Quality control of intelligence research is the core issue of intelligence management and a problem in the study of information science. This paper focuses on the performance of intelligence to explain the significance of quality control in intelligence research. On the basis of a summary and analysis of the results of the study, it discusses quality control methods in intelligence research, introduces the experience of foreign intelligence research quality control, and proposes some recommendations to improve quality control in intelligence research. (authors)

  10. Quality control of gamma radiation measuring systems

    International Nuclear Information System (INIS)

    Surma, M.J.

    2002-01-01

The problem of quality control and assurance for gamma radiation measuring systems is described in detail. The factors that determine the high quality of radiometric measurements, as well as the statistical testing and calibration of measuring systems, are presented and discussed.
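Statistical testing of a counting system of the kind referred to here is often a dispersion (chi-square) test on repeated counts, since Poisson statistics fix the expected variance at the mean (a generic sketch, not this author's specific procedure; the counts below are hypothetical):

```python
def chi_square_counts(counts):
    """Dispersion test for repeated radiation counts: under Poisson
    statistics the expected variance equals the mean, so
    chi2 = sum((n_i - nbar)^2) / nbar should be close to its
    degrees of freedom (len(counts) - 1) for a stable counter."""
    nbar = sum(counts) / len(counts)
    chi2 = sum((n - nbar) ** 2 for n in counts) / nbar
    return chi2, len(counts) - 1
```

A chi-square far above the degrees of freedom points to excess variability (instrument instability); far below suggests correlated or manipulated counts.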

  11. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Science.gov (United States)

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  12. Radiation measurements and quality control

    International Nuclear Information System (INIS)

    McLaughlin, W.L.

    1977-01-01

    Accurate measurements are essential to research leading to a successful radiation process and to the commissioning of the process and the facility. On the other hand, once the process is in production, the importance to quality control of measuring radiation quantities (i.e., absorbed dose, dose rate, dose distribution) rather than various other parameters of the process (i.e. conveyor speed, dwell time, radiation field characteristics, product dimensions) is not clearly established. When the safety of the product is determined by the magnitude of the administered dose, as in radiation sterilization, waste control, or food preservation, accuracy and precision of the measurement of the effective dose are vital. Since physical dose measurements are usually simpler, more reliable and reproducible than biological testing of the product, there is a trend toward using standardized dosimetry for quality control of some processes. In many industrial products, however, such as vulcanized rubber, textiles, plastics, coatings, films, wire and cable, the effective dose can be controlled satisfactorily by controlling process variables or by product testing itself. In the measurement of radiation dose profiles by dosimetry, it is necessary to have suitable dose meter calibrations, to account for sources of error and imprecision, and to use correct statistical procedures in specifying dwell times or conveyor speeds and source and product parameters to achieve minimum and maximum doses within specifications. (author)
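The relation between dwell time and delivered dose that underlies such process settings is simple arithmetic (an illustrative sketch only; real commissioning must map the full dose distribution so that the minimum and maximum doses stay within specification, and the numbers below are hypothetical):

```python
def dwell_time(target_dose_kgy, dose_rate_kgy_per_h):
    """Dwell time (hours) required to deliver a target absorbed dose
    at a measured dose rate."""
    return target_dose_kgy / dose_rate_kgy_per_h

def within_spec(dose_min, dose_max, spec_min, spec_max):
    """Check that the measured minimum and maximum doses in the product
    load meet the process specification."""
    return dose_min >= spec_min and dose_max <= spec_max
```

For example, delivering a 25 kGy sterilization dose at a measured 5 kGy/h dose rate implies a 5 h dwell time, after which dosimetry across the load verifies the min/max doses.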

  13. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation when the need arises.

  14. Organizing quality control programmes

    International Nuclear Information System (INIS)

    Hjardemaal, O.

    1989-01-01

When procuring new equipment, performance and safety should be specified, if possible by reference to international standards. Some of the characteristics of the International Electrotechnical Commission (IEC) standard for X-ray generators, in particular the accuracy of the operating data, are described. The quality control tests to be performed after installation comprise the acceptance test, status test and constancy test. The first two involve absolute measurements and will be the responsibility of physicists or engineers. Apparently, limiting values stipulated by users are a factor of two lower than the limits of the IEC standard. By means of an example it is shown that modern X-ray generators can meet the lower limits of the users without problems. In order to obtain optimum initial quality when procuring new equipment, limiting values for the operating data must be specified and verified by acceptance testing, etc. However, in many countries physicists and engineers are not available for this job. A relatively uncomplicated test object can be used by radiographers for checks on fluoroscopic systems. The findings from such tests in Denmark are compared with other published findings and good agreement is found. Therefore it is proposed that such uncomplicated tests could form the basis for quality evaluation. (author)

  15. Quality control of radiodiagnostics

    International Nuclear Information System (INIS)

    Alonso Diaz, M.; Castaneda Arronte, M.J.; Matorras Galan, P.; Diaz-Caneja Rodriguez, N.; Gutierrez Diaz Velarde, I.

    1993-01-01

Since May 1990, a program of quality control of diagnostic X-ray equipment has been under way in the University Hospital Marques de Valdecilla. This includes the design and application of measuring specifications and procedures corresponding to the different parameters of the equipment. The specified values are presented, as are those obtained for geometric and exposure parameters using the equipment. The specifications for the geometric parameters are fulfilled in a large proportion (between 52 and 86%) of the units, and the rest can easily be adjusted. However, 85% of the units can be made to operate with a field larger than that of the screen of the image monitor, and approximately half of them can operate at a shorter focus-to-patient distance than that specified. With respect to the exposure parameters, in general, these units do not fulfill the specifications and their behavior is not uniform. The results obtained indicate that the equipment studied could be made to comply with the proposed specifications if a Maintenance Program were initiated in coordination with the Quality Control program. (Author)

  16. A Statistical Project Control Tool for Engineering Managers

    Science.gov (United States)

    Bauch, Garland T.

    2001-01-01

This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), to project success factors, and to traditional project control tools and performance measures that are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is increasing; existing methods are limited and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs using the SPCT method plotting the results of 3 successful projects and 3 failed projects are reviewed, with success and failure being defined by the owner.

  17. Interaction between production control and quality control

    NARCIS (Netherlands)

    Bij, van der J.D.; Ekert, van J.H.W.

    1999-01-01

    Describes a qualitative study on interaction between systems for production control and quality control within industrial organisations. Production control and quality control interact in a sense. Good performance for one aspect often influences or frustrates the performance of the other. As far as

  18. Quality control in nuclear medicine

    International Nuclear Information System (INIS)

    Leme, P.R.

    1983-01-01

    The following topics are discussed: objectives of the quality control in nuclear medicine; the necessity of the quality control in nuclear medicine; guidelines and recommendations. An appendix is given concerning the guidelines for the quality control and instrumentation in nuclear medicine. (M.A.) [pt

  19. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

a large number of nonconforming products are manufactured. … size, n, sampling interval, h, and control limit parameter, k, that minimize the … [11] Montgomery DC, 2001, Introduction to statistical quality control, 4th Edition, John Wiley, New.

  20. Printing quality control automation

    Science.gov (United States)

    Trapeznikova, O. V.

    2018-04-01

One of the most important problems in standardizing the offset printing process is control of the print quality rating and its automation. To solve the problem, software has been developed that takes into account the specifics of the printing system components and their behavior in the printing process. To characterize the distribution of the ink layer on the printed substrate, the so-called deviation of the ink layer thickness on the sheet from the nominal surface is suggested. The geometric construction of the surface projections of the color gamut bodies allows visualization of the color reproduction gamut of printing systems in brightness ranges and specific color sectors, which provides a qualitative comparison of systems by the reproduction of individual colors over varying ranges of brightness.

  1. Quality control in urinalysis.

    Science.gov (United States)

    Takubo, T; Tatsumi, N

    1999-01-01

Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, by the Japanese Association of Medical Technologists, by the Osaka Medical Association and by manufacturers. The QC survey in urinalysis of synthetic urine using a reagent strip and instrument made by the same manufacturer, and using an automated urine cell analyser, provided satisfactory results among laboratories. The QC survey in urinalysis of synthetic urine using reagent strips and instruments made by various manufacturers indicated differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments have different characteristics. The QC photo survey in urinalysis on microscopic photos of urine sediment constituents indicated differences in the identification of cells among laboratories. From these results, it is necessary to standardize the reagent strip method, the manual and automated methods, and synthetic urine.

  2. Kansas's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  3. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  4. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods, and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  5. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  6. Statistical process control for alpha spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility, as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-and-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why the problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail, allowing an orderly and timely shutdown of the process to perform preventive maintenance and avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.
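    The individuals (X) chart named in this abstract can be sketched in a few lines. This is an illustrative fragment, not the authors' implementation: the background count rates are invented, and the only assumed convention is the standard d2 = 1.128 constant for estimating sigma from the average moving range of successive points.

    ```python
    import numpy as np

    # Hypothetical daily background count rates (counts/s) from an alpha detector.
    background = np.array([0.012, 0.011, 0.013, 0.012, 0.010, 0.011, 0.013,
                           0.012, 0.011, 0.012, 0.014, 0.025, 0.011, 0.012])

    # Individuals (X) chart: estimate sigma from the average moving range,
    # using the d2 constant for subgroups of size 2 (d2 = 1.128).
    moving_range = np.abs(np.diff(background))
    sigma_hat = moving_range.mean() / 1.128
    center = background.mean()
    ucl = center + 3 * sigma_hat
    lcl = center - 3 * sigma_hat

    # Flag points outside the 3-sigma limits (possible contamination events).
    out_of_control = np.where((background > ucl) | (background < lcl))[0]
    print(out_of_control)  # the 0.025 reading (index 11) stands out
    ```

    On real data the flagged index would of course depend on the measurements; here the elevated reading is planted to show the mechanics of the chart.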

  7. Statistical process control for alpha spectroscopy

    International Nuclear Information System (INIS)

    Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.

    1995-01-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility, as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-and-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why the problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail, allowing an orderly and timely shutdown of the process to perform preventive maintenance and avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs

  8. Quality Control in Production Processes

    Directory of Open Access Journals (Sweden)

    Prístavka Miroslav

    2016-09-01

    Full Text Available The tools of quality management are used for quality improvement throughout Europe and the developed countries. Simple statistical methods are considered among the most basic. The goal was to apply these simple statistical methods in practice and to use them to solve problems. The selected methods are used for processing the list of internal discrepancies within the organization, and for identifying the root cause of a problem and its appropriate solution. The seven basic quality tools are simple graphical tools, but very effective in solving problems related to quality. They are called essential because they are suitable for people with at least basic knowledge of statistics; therefore, they can be used to solve the vast majority of problems.

  9. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered there and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows diagnostic plots to be built from the parameter estimates that can provide valuable insight...
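    The sliding-window monitoring idea of Paper 2 can be sketched under a simple assumption the thesis itself does not fix: here the quality characteristic is modeled as normal, and the specification limits, window length, threshold, and data are all invented for illustration.

    ```python
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(seed=1)

    USL, LSL = 10.5, 9.5     # specification limits (assumed for illustration)
    WINDOW = 20              # sliding window length
    THRESHOLD = 0.05         # stop if P(next item out of spec) exceeds this

    def prob_out_of_spec(obs):
        """Estimate P(next item outside [LSL, USL]) under a fitted normal model."""
        mu, s = obs.mean(), obs.std(ddof=1)
        cdf = lambda x: 0.5 * (1 + erf((x - mu) / (s * sqrt(2))))
        return cdf(LSL) + (1 - cdf(USL))

    # An in-control stretch followed by a mean shift toward the upper limit.
    data = np.concatenate([rng.normal(10.0, 0.1, 60), rng.normal(10.4, 0.1, 40)])

    stopped_at = None
    for t in range(WINDOW, len(data)):
        if prob_out_of_spec(data[t - WINDOW:t]) > THRESHOLD:
            stopped_at = t
            break
    print(stopped_at)
    ```

    The appeal noted in the abstract shows even in this toy version: the fitted model can be made arbitrarily richer than a normal fit without changing what the end user sees, namely a single stop/continue decision per item.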

  10. Use of statistical process control in the production of blood components

    DEFF Research Database (Denmark)

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test whether a difference was seen in the quality of the platelets produced when a change was made from a relatively large, inexperienced, occasional component manufacturing staff to an experienced regular manufacturing staff with four technologists. Production of blood products is a semi-automated process in which the manual steps may be difficult to control, and this study was performed in an ongoing effort to improve the control and optimize the quality of the blood products. We applied statistical process control to examine whether time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional...

  11. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Science.gov (United States)

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by the pretreatment quality control results. It was therefore necessary to put the portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to determine whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect the guidelines: the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing drifts both slow and weak as well as strong and fast, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to...
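    The slow, weak drifts that this abstract says the control charts revealed are exactly what an EWMA chart is designed to catch. The sketch below is not the authors' method (they used mean and standard-deviation charts); it is a hedged illustration of drift detection on invented dose-deviation data, with the chart parameters (lambda = 0.2, known sigma) chosen for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=7)

    # Hypothetical daily dose-deviation measurements (%) from pretreatment QC:
    # in control around 0 at first, then a slow upward drift of the mean
    # (e.g. a gradually shifting leaf gap).
    in_control = rng.normal(0.0, 0.5, 40)
    drifting = rng.normal(0.0, 0.5, 30) + np.linspace(0.0, 1.5, 30)
    x = np.concatenate([in_control, drifting])

    # EWMA chart: more sensitive than a plain Shewhart chart to small drifts.
    lam, mu0, sigma = 0.2, 0.0, 0.5
    limit = 3 * sigma * np.sqrt(lam / (2 - lam))  # steady-state 3-sigma limit

    z, signal = mu0, None
    for t, xt in enumerate(x):
        z = lam * xt + (1 - lam) * z
        if abs(z - mu0) > limit:
            signal = t
            break
    print(signal)
    ```

    The exponential weighting lets a sequence of individually unremarkable points accumulate into a signal, which is why EWMA-type charts are a common complement to Shewhart charts when slow drifts matter.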

  12. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Francois, P.; Noel, A.

    2010-01-01

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by the pretreatment quality control results. It was therefore necessary to put the portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to determine whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect the guidelines: the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation, showing drifts both slow and weak as well as strong and fast, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study demonstrated the feasibility of reducing the time devoted to pretreatment quality control...

  13. Commercial jet fuel quality control

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, K.H.

    1995-05-01

    The paper discusses the purpose of jet fuel quality control between the refinery and the aircraft. It describes fixed equipment, including various types of filters, and the usefulness and limitations of this equipment. Test equipment is reviewed as are various surveillance procedures. These include the Air Transport Association specification ATA 103, the FAA Advisory Circular 150/5230-4, the International Air Transport Association Guidance Material for Fuel Quality Control and Fuelling Service and the Guidelines for Quality Control at Jointly Operated Fuel Systems. Some past and current quality control problems are briefly mentioned.

  14. Control by quality: proposition of a typology.

    Science.gov (United States)

    Pujo, P; Pillet, M

    The application of quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First, the authors draw a parallel between production control and the quality organizational structure. They note the duality between control, which aims to increase productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop, a notion fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with statistical process control, the Taguchi technique, and the "six sigma" approach. At the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm; the management system can refer here to quality assurance, total productive maintenance, or management by total quality. The formalization, through procedures, of the decision rules governing process control enhances the validity of these rules, leading to greater reliability and to their consolidation. All this counterbalances the human, intrinsically fluctuating, behavior of the control...

  15. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    Science.gov (United States)

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  16. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Science.gov (United States)

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  17. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    Science.gov (United States)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify their sources in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters at 67 sites distributed across the city. Hierarchical cluster analysis grouped the 67 sampling stations into two groups, cluster 1 with high pollution and cluster 2 with lesser pollution. Discriminant analysis was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, discriminating between water quality in the pre-monsoon and post-monsoon seasons and accounting for 72% of the seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% of the spatial assignation of cases. Principal component analysis applied to the datasets obtained from the two clusters yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. The varifactors obtained from principal component analysis showed that the groundwater quality variation is mainly explained by the dissolution of minerals through rock-water interactions in the aquifer, the effect of anthropogenic activities, and ion exchange processes in the water.
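    The PCA step used in this study amounts to diagonalizing the correlation matrix of standardized water-quality parameters. A minimal numpy-only sketch, on invented data (three illustrative parameters, with hardness and Mg deliberately correlated to mimic a shared mineral-dissolution source):

    ```python
    import numpy as np

    # Toy water-quality matrix: rows = sampling sites, columns = parameters.
    # (Values are illustrative, not the study's data.)
    rng = np.random.default_rng(seed=0)
    n_sites = 12
    hardness = rng.normal(50, 5, n_sites)
    mg = 0.8 * hardness + rng.normal(0, 1, n_sites)  # correlated with hardness
    no3 = rng.normal(20, 4, n_sites)                 # independent source
    X = np.column_stack([hardness, mg, no3])

    # Standardize, then diagonalize the correlation matrix (classical PCA).
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    corr = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]
    eigvals = eigvals[order]

    explained = eigvals / eigvals.sum()
    print(explained)  # first component captures the hardness-Mg association
    ```

    The "varifactors" reported in the abstract come from an additional varimax rotation of such components, which is omitted here for brevity.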

  18. INFORMATION SYSTEM QUALITY CONTROL KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    Vladimir Nikolaevich Babeshko

    2017-02-01

    Full Text Available The development of the educational system is associated with the need to control the quality of educational services. Quality control of knowledge is an important part of the scientific process. The penetration of computers into all areas of activity is changing the approaches and technologies that were previously used.

  19. Statistical Framework for Recreational Water Quality Criteria and Monitoring

    DEFF Research Database (Denmark)

    Halekoh, Ulrich

    2008-01-01

    Administrators of recreational waters face the basic tasks of surveillance of water quality and decisions on beach closure in case of unacceptable quality. Monitoring and subsequent decisions are based on sampled water probes, and fundamental questions for the governmental authorities controlling recreational water quality are which type of data to extract from the samples and how to use them in decision rules. The book opens with a historical account of water quality criteria in the USA between 1922 and 2003. Five chapters are related to sampling strategies and decision rules; Chapter 2, for example, discusses the dependence of decision-making rules on short... Statistical modeling exploiting additional information, such as meteorological data, can support the decision process, as shown in Chapter 10. The question of which information to extract from water sample analyses is closely related to the task of risk assessment for human health. Beach-water quality is often measured...

  20. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  1. Quality assurance programme and quality control

    International Nuclear Information System (INIS)

    Alvarez de Buergo, L.

    1979-01-01

    The paper analyses the requirements for the quality assurance and control in nuclear power plant projects which are needed to achieve safe, reliable and economic plants. The author describes the structure for the establishment of a nuclear programme at the national level and the participation of the different bodies involved in a nuclear power plant project. The paper ends with the study of a specific case in Spain. (NEA) [fr

  2. Quality control of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Wallen, O.; Komarov, E.

    1973-01-01

    The International Pharmacopoeia published by WHO constitutes a collection of recommended specifications for pharmaceutical preparations which are not intended to have legal status in any country, but serve as references so that national specifications can be established on a similar basis in any country. Like any pharmacopoeia, it contains monographs for the quality control of drugs by means of chemical, physical and simple biological methods, as well as appendices describing general methods. The work on the International Pharmacopoeia is carried out by WHO with the aid of the Expert Advisory Panel on the International Pharmacopoeia and Pharmaceutical Preparations, other specialists from various countries, and the Expert Committee on Specifications for Pharmaceutical Preparations. (author)

  3. Errors in patient specimen collection: application of statistical process control.

    Science.gov (United States)

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
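    Monitoring the frequency of mislabeled samples, as described above, fits naturally on a p-chart (a control chart for proportions). The following is a minimal sketch of that idea, not the authors' spreadsheet: the monthly counts and sample volumes are invented, and the limits are the standard binomial 3-sigma limits around the pooled proportion.

    ```python
    import numpy as np

    # Monthly counts of mislabeled samples (hypothetical) and samples collected.
    mislabeled = np.array([8, 6, 9, 7, 10, 8, 7, 22, 9, 6, 8, 7])
    collected = np.array([4000] * 12)

    p = mislabeled / collected
    p_bar = mislabeled.sum() / collected.sum()  # pooled proportion (center line)

    # p-chart limits: binomial 3-sigma limits around the pooled proportion.
    sigma_p = np.sqrt(p_bar * (1 - p_bar) / collected)
    ucl = p_bar + 3 * sigma_p
    lcl = np.maximum(p_bar - 3 * sigma_p, 0)  # proportions cannot go below 0

    flagged = np.where((p > ucl) | (p < lcl))[0]
    print(flagged)  # month index 7 exceeds the upper limit
    ```

    Because the limits depend on each month's denominator, hospitals with different sample volumes (like the four smaller facilities pooling WBIT data in the study) can share one chart with per-point limits.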

  4. Quality control of labelled compounds

    International Nuclear Information System (INIS)

    Matucha, M.

    1979-01-01

    Some advantages and disadvantages of the methods used for the quality control of organic labelled compounds (131I, 14C) are briefly discussed. The methods used are electrophoresis, ultraviolet and infrared spectrometry, and radio-gas and thin-layer chromatography. (author)

  5. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2017-03-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, since meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, the Kruskal-Wallis and Mann-Whitney tests, and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the previously defined air basins were not confirmed, because some sites had similar emission sources. Therefore, the air basins need to be redefined, since they are important for air quality management.
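    The Mann-Whitney test used in this study compares two sites without assuming normality. A self-contained sketch using only numpy and the normal approximation (the PM2.5 values are invented, and ties are assumed absent, which is adequate for continuous concentration data):

    ```python
    import numpy as np
    from math import erf, sqrt

    def mann_whitney_u(a, b):
        """Two-sided Mann-Whitney U test via the normal approximation.
        Assumes no ties among the observations."""
        n1, n2 = len(a), len(b)
        combined = np.concatenate([a, b])
        ranks = combined.argsort().argsort() + 1  # ranks 1..n1+n2
        r1 = ranks[:n1].sum()                     # rank sum of sample a
        u1 = r1 - n1 * (n1 + 1) / 2
        mu = n1 * n2 / 2
        sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
        z = (u1 - mu) / sigma
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return u1, p_value

    # Hypothetical PM2.5 (ug/m3) at an industrial vs. a background site.
    industrial = np.array([24.1, 31.5, 28.3, 35.2, 29.9, 33.1, 27.4, 30.8])
    background = np.array([12.2, 15.7, 11.9, 14.3, 16.1, 13.5, 12.8, 15.0])

    u, p = mann_whitney_u(industrial, background)
    print(u, p)  # every industrial value exceeds background: U = 64, p < 0.01
    ```

    With samples this small, an exact test would be preferable in practice; the normal approximation keeps the example dependency-free.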

  6. Statistical assessment of quality of credit activity of Ukrainian banks

    Directory of Open Access Journals (Sweden)

    Moldavska Olena V.

    2013-03-01

    Full Text Available The article conducts an economic and statistical analysis of the current state of credit activity of Ukrainian banks and the main tendencies of its development, and justifies the urgency of a statistical study of the credit activity of banks. It offers a complex system for the assessment of bank lending at two levels: the level of the banking system and the level of an individual bank. The use of system analysis allows the interconnection between the effectiveness of the banking system and the quality of the credit portfolio to be reflected. The article considers the main aspects of managing credit portfolio quality: the level of troubled debt and credit risk. It addresses the problem of the adequate quantitative assessment of troubled loans in the credit portfolios of banks, since the calculation methodologies used by the National Bank of Ukraine and by international rating agencies differ considerably. Finally, the article presents a system of credit risk management methods, both theoretically and with specific examples, in the context of preventing the occurrence of risk situations or eliminating their consequences.

  7. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2018-06-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, since meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, the Kruskal-Wallis and Mann-Whitney tests, and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the previously defined air basins were not confirmed, because some sites had similar emission sources. Therefore, the air basins need to be redefined, since they are important for air quality management.

  8. [Quality control in herbal supplements].

    Science.gov (United States)

    Oelker, Luisa

    2005-01-01

    Quality and safety of food and herbal supplements are the result of a number of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, whose correct application can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  9. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    Science.gov (United States)

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
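    The common-vs-special-cause distinction described in this review is operationalized through control chart rules. One widely used rule (a run of 8 consecutive points on one side of the center line) detects a sustained shift even when no single point breaches the 3-sigma limits. A hedged sketch on invented weekly QI data, with the center line frozen from a baseline period:

    ```python
    import numpy as np

    # Hypothetical weekly rates of a neonatal QI measure: stable baseline,
    # then a sustained improvement after an intervention at week 20.
    baseline = [0.30, 0.28, 0.33, 0.31, 0.29, 0.32, 0.30, 0.27, 0.31, 0.30,
                0.29, 0.33, 0.28, 0.31, 0.30, 0.32, 0.29, 0.30, 0.31, 0.28]
    after = [0.24, 0.23, 0.25, 0.22, 0.24, 0.23, 0.25, 0.24, 0.23, 0.22]
    rates = np.array(baseline + after)

    center = np.mean(baseline)  # center line fixed from the baseline period

    # Run rule: 8 consecutive points on one side of the center line is a
    # special-cause signal, even without a 3-sigma breach.
    below = rates < center
    signal, run = None, 0
    for t, flag in enumerate(below):
        run = run + 1 if flag else 0
        if run >= 8:
            signal = t
            break
    print(signal)  # week 26: the eighth consecutive point below center
    ```

    In a QI context this signal is good news (the intervention shifted the process), at which point the chart limits would be recalculated for the new process.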

  10. Using time series for the statistical monitoring of spectral quality index of electron beams for clinical use; Uso de series temporales para el control estadistico del indice de calidad espectral de haces de electrones para uso clinico

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Luna, R. J.; Vega, J. M. de la; Vilches, M.; Guirado, D.; Zamora, L. I.; Lallena, A. M.

    2011-07-01

    Statistical process control (SPC) techniques have been used to track the variable that characterizes the stability of the spectrum of electron beams from accelerators in clinical use. In this process, applied since 1995, we obtained a high number of false alarms. Our work shows that this unexpected behavior arises from treating the variable of interest as a normal, independent and identically distributed (iid) random variable, when in fact the observations of this variable are positively correlated with each other. (Author)
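    The false-alarm inflation described here can be reproduced in a few lines: positively correlated observations make moving-range-based control limits too tight for the marginal spread of the data. This simulation is illustrative only (an AR(1) model with an assumed coefficient of 0.8 stands in for whatever correlation structure the beam data actually have):

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=42)
    n = 5000

    # iid observations vs. positively correlated AR(1) observations,
    # both with unit marginal variance.
    phi = 0.8
    iid = rng.normal(0, 1, n)
    ar = np.empty(n)
    ar[0] = rng.normal(0, 1)
    for t in range(1, n):
        ar[t] = phi * ar[t - 1] + rng.normal(0, np.sqrt(1 - phi**2))

    def alarm_rate(x):
        """Fraction of points outside naive 3-sigma limits estimated from
        the average moving range, as on an individuals chart (d2 = 1.128)."""
        sigma_hat = np.abs(np.diff(x)).mean() / 1.128
        return np.mean(np.abs(x - x.mean()) > 3 * sigma_hat)

    print(alarm_rate(iid), alarm_rate(ar))
    ```

    For the iid series the alarm rate stays near the nominal 0.27%; for the correlated series the successive-difference estimate understates the marginal sigma, so the rate is dramatically higher, which is the "unexpected" excess of false alarms the abstract reports.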

  11. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    Science.gov (United States)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of the severe environmental degradation caused by the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. But such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such an approach, and they are an inefficient route to better forecasts. With the advancement in technology and research, an alternative to the above traditional methods has been proposed: coupling statistical techniques with artificial intelligence (AI). The coupling of PCA, ANN and fuzzy logic is used here for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
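    The MLR baseline in studies like this one is ordinary least squares on meteorological predictors. A minimal sketch with invented data (the predictors, coefficients, and noise level are all assumptions for illustration, not values from the study):

    ```python
    import numpy as np

    # Toy setup: forecast a pollutant concentration from two meteorological
    # predictors (wind speed, temperature). Illustrative data only.
    rng = np.random.default_rng(seed=3)
    n = 200
    wind = rng.uniform(0.5, 6.0, n)    # m/s; stronger wind disperses pollution
    temp = rng.uniform(10.0, 40.0, n)  # deg C
    conc = 80.0 - 8.0 * wind + 0.5 * temp + rng.normal(0, 3.0, n)

    # Multiple linear regression via ordinary least squares.
    X = np.column_stack([np.ones(n), wind, temp])
    beta, *_ = np.linalg.lstsq(X, conc, rcond=None)

    pred = X @ beta
    r = np.corrcoef(conc, pred)[0, 1]  # correlation coefficient R
    print(beta, r)
    ```

    The abstract's criticism also shows up in this toy model: a linear fit tracks the bulk of the data well (high R) but systematically misses extreme episodes, which motivates the hybrid PCA/ANN/fuzzy approach.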

  12. Fuel cycle and quality control

    International Nuclear Information System (INIS)

    Stoll, W.

    1979-01-01

    The volume of the fuel cycle is described in terms of its economic importance and its throughput, as envisaged for the Federal Republic of Germany. Quality is defined as the continuing usefulness of an object and translated into quality criteria. Requirements on the performance of fuel elements are defined. The way in which experimental results are translated into mass production of fuel rods is described. The economic potential for further quality effort is derived. Future directions of development for quality control organisation and structure are outlined. (Auth.)

  13. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessment, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
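    The capability-to-sigma step described above can be sketched in a few lines. The tablet weights and specification limits below are invented for illustration, and the sketch uses one common convention in which the short-term sigma level equals 3·Cpk (the distance from the mean to the nearest specification limit, in standard deviations). A real study would first verify normality and statistical stability, as the abstract notes.

```python
import numpy as np

def process_capability(data, lsl, usl):
    """Cp, Cpk and short-term sigma level from the sample mean and SD."""
    data = np.asarray(data, float)
    mu, sd = data.mean(), data.std(ddof=1)
    cp = (usl - lsl) / (6 * sd)                       # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sd)          # actual capability
    sigma_level = 3 * cpk                             # Z_min convention
    return cp, cpk, sigma_level

# hypothetical tablet weights (mg) against 95-105 mg specification limits
rng = np.random.default_rng(1)
weights = rng.normal(100.5, 1.2, size=200)
cp, cpk, sigma = process_capability(weights, 95, 105)
print(f"Cp = {cp:.2f}  Cpk = {cpk:.2f}  sigma level = {sigma:.2f}")
```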

  14. Water Quality Attainment Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Designated uses assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality...

  15. Water Quality Stressor Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Stressors assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality assessments...

  16. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
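    The p-chart analysis used above follows the standard construction: a centre line at the pooled event proportion and 3-sigma limits that vary with subgroup size. A minimal sketch with invented monthly counts (not the study's data):

```python
import numpy as np

def p_chart_limits(event_counts, sample_sizes):
    """Centre line and 3-sigma limits for a p-chart (fraction with events)."""
    d = np.asarray(event_counts, float)
    n = np.asarray(sample_sizes, float)
    pbar = d.sum() / n.sum()                     # pooled proportion
    se = np.sqrt(pbar * (1 - pbar) / n)          # per-subgroup standard error
    ucl = pbar + 3 * se
    lcl = np.clip(pbar - 3 * se, 0, None)        # a proportion cannot be < 0
    return pbar, lcl, ucl

# hypothetical monthly counts of difficult emergence per number of anesthetics
events = [12, 9, 15, 30, 11, 14]
cases  = [600, 550, 640, 620, 580, 610]
pbar, lcl, ucl = p_chart_limits(events, cases)
rates = np.array(events, float) / np.array(cases, float)
out_of_control = rates > ucl
print(f"pbar = {pbar:.4f}", out_of_control)
```

    A month whose rate falls outside the limits (the fourth one here) signals special-cause variation worth investigating; points within the limits reflect the predictable normal variation the abstract refers to.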

  17. Principles and Practices for Quality Assurance and Quality Control

    Science.gov (United States)

    Jones, Berwyn E.

    1999-01-01

    Quality assurance and quality control are vital parts of highway runoff water-quality monitoring projects. To be effective, project quality assurance must address all aspects of the project, including project management responsibilities and resources, data quality objectives, sampling and analysis plans, data-collection protocols, data quality-control plans, data-assessment procedures and requirements, and project outputs. Quality control ensures that the data quality objectives are achieved as planned. The historical development and current state of the art of quality assurance and quality control concepts described in this report can be applied to evaluation of data from prior projects.

  18. Statistical assessment of coal charge effect on metallurgical coke quality

    Directory of Open Access Journals (Sweden)

    Pavlína Pustějovská

    2016-06-01

    Full Text Available The paper studies coke quality. Blast furnace research has concentrated on the iron ore charge, while coke was not studied because, under previous conditions, it seemed to be good enough. Nowadays, requirements for blast furnace coke have risen, especially requirements for coke reactivity. The level of the reactivity parameter is determined primarily by the composition and properties of the coal mixtures used for coking. The paper presents a statistical analysis of the strength and character of the relationship between selected properties of the coal mixture and coke reactivity. The Statgraphics software, using both simple and multiple linear regression, was used for the calculations. The obtained regression equations provide a statistically significant prediction of the reactivity of coke, or its strength after reaction with CO2, and thus allow their subsequent management by changing the composition and properties of the coal mixture. The CSR/CRI indexes were determined for the coke. Fifty-four results were acquired in the experimental part, where the correlation between the CRI index and coal components was studied. For linear regression the coefficient of determination was 55.0204%; between the parameters CRI and inertinite it was 21.5873%. For regression between CRI and coal components it was 31.03%. For multiple linear regression between CRI and three feedstock components the coefficient of determination was 34.0691%. The final correlation showed that higher ash decreases the final coke reactivity, a higher content of volatile matter in the coal increases the total coke reactivity, and a higher amount of inertinite in the coal increases the reactivity. Generally, coke quality is significantly affected by coal processing, carbonization and the maceral content of the coal mixture.
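    The regression workflow in the abstract, fitting CRI against a coal-mixture property and reading off the coefficient of determination, can be sketched with made-up data (the pairs below are illustrative and are not the paper's 54 measurements; following the abstract's finding, CRI is made to fall as ash rises):

```python
import numpy as np

def linreg(x, y):
    """Least-squares slope/intercept and coefficient of determination R^2."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return slope, intercept, 1 - ss_res / ss_tot

# hypothetical pairs: coal ash content (%) vs coke reactivity index CRI (%)
ash = [7.5, 8.1, 8.8, 9.2, 9.9, 10.4, 11.0]
cri = [31.1, 30.5, 29.0, 27.2, 26.9, 25.1, 24.0]
slope, intercept, r2 = linreg(ash, cri)
print(f"CRI = {slope:.2f}*ash + {intercept:.2f},  R^2 = {r2:.3f}")
```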

  19. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    Science.gov (United States)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products getting into the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the assembly quality of multi-element mass-produced products on automatic rotor lines. A feature of continuous sampling control of multi-element product completeness in the assembly process is that the inspection is destructive, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of multi-element product completeness in assembly on automatic rotor lines requires sampling plans that ensure a minimum control sample size. Comparison of the values of the limit of the average outgoing defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 provides lower limit values for the average outgoing defect level. Also, the average sample size when using the ACSP-1 plan is smaller than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the use of the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and ensure the required level of quality of assembled products while minimizing sample size.
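    The baseline CSP-1 plan the abstract compares against has simple logic: inspect 100% of units until i consecutive good ones are found, then inspect only a random fraction f, reverting to 100% inspection whenever a defect is found. A sketch with assumed parameter values (i = 20, f = 0.1, 2% defective), not the paper's:

```python
import random

def csp1(stream, i=20, f=0.1, rng=None):
    """CSP-1 continuous sampling: returns (units inspected, defects found).
    stream is an iterable of booleans, True meaning the unit is good."""
    rng = rng or random.Random(0)
    consecutive, inspected, found = 0, 0, 0
    sampling = False
    for unit_ok in stream:
        if sampling and rng.random() >= f:
            continue                       # unit passes uninspected
        inspected += 1
        if unit_ok:
            consecutive += 1
            if not sampling and consecutive >= i:
                sampling = True            # switch to sampling inspection
        else:
            found += 1
            consecutive, sampling = 0, False   # back to 100% inspection
    return inspected, found

# hypothetical stream of 5000 units, ~2% defective
g = random.Random(7)
stream = [g.random() > 0.02 for _ in range(5000)]
inspected, found = csp1(stream)
print(inspected, found, f"inspection fraction = {inspected/5000:.2f}")
```

    The plan's attraction is exactly the trade-off the abstract discusses: the long-run inspection fraction falls well below 100% while the average outgoing quality stays bounded.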

  20. Blind image quality assessment based on aesthetic and statistical quality-aware features

    Science.gov (United States)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between the objective scores of these methods and human perceptual scores is considered their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetics image features with features of natural image statistics derived from multiple domains. The proposed features have been used for augmenting five different state-of-the-art BIQA methods, which use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement of the accuracy of the methods.

  1. Quality control with R an ISO standards approach

    CERN Document Server

    Cano, Emilio L; Prieto Corcoba, Mariano

    2015-01-01

    Presenting a practitioner's guide to capabilities and best practices of quality control systems using the R programming language, this volume emphasizes accessibility and ease-of-use through detailed explanations of R code as well as standard statistical methodologies. In the interest of reaching the widest possible audience of quality-control professionals and statisticians, examples throughout are structured to simplify complex equations and data structures, and to demonstrate their applications to quality control processes, such as ISO standards. The volume balances its treatment of key aspects of quality control, statistics, and programming in R, making the text accessible to beginners and expert quality control professionals alike. Several appendices serve as useful references for ISO standards and common tasks performed while applying quality control with R.

  2. Quality Control - Nike.Inc

    OpenAIRE

    Walter G. Bishop

    2017-01-01

    The purpose of this paper is to present an illustration of the quality control approach that has been adopted by several organizations in order to manage and improve their production processes. The approach is referred to as total quality management (TQM). This study will discuss the implementation of TQM within the working environment of Nike Inc. One of the major objectives behind the implementation of TQM is to reduce or completely eliminate potential errors and flaws within the manufacturing...

  3. Quality assurance, quality control and quality audit in diagnostic radiology

    International Nuclear Information System (INIS)

    Vassileva, J.

    2009-01-01

    Full text: The lecture aims to present a contemporary view of quality assurance in X-ray diagnosis and its practical realization in Bulgaria. In the lecture the concepts of quality assurance, quality control and clinical audit are defined and their scope is considered. Answers are given to the following questions: why is it necessary to determine the patient dose in X-ray studies; what is the reference dose level and how is it used; which dosimetric quantities characterize the patient's exposure in X-ray, mammography and CT examinations and how are they measured; who conducts the measurements and how are the records kept; and what are the variations of doses in identical examinations and what determines them? The findings from a national survey of doses in diagnostic radiology, conducted in 2008-2009, and the newly developed national reference levels will be presented. The main findings of the first tests of radiological equipment, the future role of quality control, and the concept of conducting clinical audit and its role in quality assurance are also presented. Quality assurance of the diagnostic process with minimal exposure of patients is a strategic goal whose realization requires understanding, organization and practical action, both nationally and in every hospital. To achieve this, the role of education and training of physicians, radiological technicians and medical physicists is essential

  4. Characterization of groundwater quality using water evaluation indices, multivariate statistics and geostatistics in central Bangladesh

    Directory of Open Access Journals (Sweden)

    Md. Bodrud-Doza

    2016-04-01

    Full Text Available This study investigates the groundwater quality in the Faridpur district of central Bangladesh based on 60 preselected sample points. Water evaluation indices and a number of statistical approaches, such as multivariate statistics and geostatistics, are applied to characterize the water quality with respect to drinking purposes. The study reveals that the EC, TDS, Ca2+, total As and Fe values of the groundwater samples exceeded Bangladeshi and international standards. The groundwater quality index (GWQI) showed that about 47% of the samples belonged to good-quality water for drinking purposes. The heavy metal pollution index (HPI), degree of contamination (Cd) and heavy metal evaluation index (HEI) reveal that most of the samples belong to a low level of pollution; however, Cd provides a better alternative than the other indices. Principal component analysis (PCA) suggests that groundwater quality is mainly related to geogenic (rock-water interaction) and anthropogenic sources (agrogenic and domestic sewage) in the study area. The findings of cluster analysis (CA) and the correlation matrix (CM) are consistent with the PCA results. The spatial distributions of the groundwater quality parameters are determined by geostatistical modeling. The exponential semivariogram model is validated as the best-fitted model for most of the index values. It is expected that the outcomes of the study will provide insights for decision makers taking proper measures for groundwater quality management in central Bangladesh.
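    Water evaluation indices of the kind used above typically combine per-parameter quality ratings with relative weights. A minimal weighted-arithmetic sketch, with invented measurements, standards and weights (not the study's parameter set or weighting scheme):

```python
def water_quality_index(measured, standards, weights):
    """Minimal weighted-arithmetic index: each parameter's quality rating is
    (measured/standard)*100, combined using normalized relative weights."""
    total_w = sum(weights.values())
    return sum(
        (weights[p] / total_w) * (measured[p] / standards[p]) * 100
        for p in measured
    )

# hypothetical sample vs. assumed drinking-water standards (mg/L)
measured  = {"TDS": 800.0, "Fe": 0.6, "As": 0.02}
standards = {"TDS": 1000.0, "Fe": 0.3, "As": 0.01}
weights   = {"TDS": 1.0, "Fe": 2.0, "As": 4.0}
wqi = water_quality_index(measured, standards, weights)
print(f"WQI = {wqi:.1f}")
```

    Values above 100 indicate water that, on a weighted basis, exceeds the standards, which is how index classes such as "good quality" are delimited.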

  5. Quality control of imaging devices

    International Nuclear Information System (INIS)

    Soni, P.S.

    1992-01-01

    Quality assurance in nuclear medicine refers collectively to all aspects of a nuclear medicine service. It includes patient scheduling, radiopharmaceutical preparation and dispensing, radiation protection of patients, staff and the general public, preventive maintenance and care of instruments, methodology, data interpretation and record keeping, and many other small things which contribute directly or indirectly to the overall quality of a nuclear medicine service in a hospital. Quality control, on the other hand, refers to a single component of the system and is usually applied in relation to a specific instrument and its performance

  6. Quality control for dose calibrators

    International Nuclear Information System (INIS)

    Mendes, L.C.G.

    1984-01-01

    Nuclear medicine laboratories are required to assay samples of radioactivity to be administered to patients. Almost universally, these assays are accomplished by use of a well ionization chamber isotope calibrator. The Instituto de Radioprotecao e Dosimetria (Institute for Radiological Protection and Dosimetry) of the Comissao Nacional de Energia Nuclear (National Commission for Nuclear Energy) is carrying out a National Quality Control Programme in Nuclear Medicine, supported by the International Atomic Energy Agency. The assessment of the current needs and practices of quality control in the entire country of Brazil includes Dose Calibrators and Scintillation Cameras, but this manual is restricted to the former. Quality Control Procedures for these Instruments are described in this document together with specific recommendations and assessment of its accuracy. (author)

  7. Quality control in diagnostic radiology - patient dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Prlic, I; Radalj, Z; Brumen, V; Cerovac, H [Institute for Medical Research and Occupational Health, Laboratory for Radiation Protection and Dosimetry, Zagreb (Croatia); Gladic, J [Institute for Physics, Laboratory for Solid State Physics, Zagreb (Croatia); Tercek, V [Clinical Hospital Sisters of Mercy, Health Physics Department, Zagreb (Croatia)

    1997-12-31

    In order to establish quality criteria for diagnostic radiographic images in the radiology departments of the Republic of Croatia, we have started several Quality Control projects in the field. The measurements are performed according to the methodology recommendations in our law, but the methodology, measurement principles, measurement equipment, phantoms, parameters suitable for routine use by radiographers, statistical and numerical evaluation, dosimetric philosophy, etc. were previously left to the individual or group judgment of each person involved in the procedure of evaluating diagnostic radiology images/diagnoses. The important quality elements of the imaging process are: the diagnostic quality of the radiographic image, the radiation dose to the patient and the choice of radiographic technique. These depend on the radiation quality of the X-ray unit (tube), the image processing quality and the final image evaluation quality. In this paper we show how Quality Control measurements can easily be connected to the dose delivered to the patient for a known diagnostic procedure, and how this can be used by radiographers in their daily work. The reproducibility of the X-ray generator was checked before and after the service calibration. A table of kV dependence and output dose per mAs was calculated, and the ESD (entrance surface dose) was measured/calculated for each specific diagnostic procedure. After the phantom calculations were made and the dose prediction for the given procedure was done, measurements were made on patients (digital dosemeters, TLD and film dosemeter combinations). We claim that there is no need to measure each patient if the proper Quality Control measurements are done and a proper table of ESD for each particular X-ray tube in a diagnostic department is calculated for the radiographers' daily use. (author). 1 example, 1 fig., 13 refs.
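    The step from a measured tube output table to a predicted patient dose can be sketched with a commonly used inverse-square estimate. All numbers below (output, mAs, focus-to-skin distance, backscatter factor) are assumed illustrative values, not the paper's data:

```python
def entrance_surface_dose(output_ugy_per_mas, mas, fsd_m, bsf=1.35):
    """Estimate entrance surface dose (ESD) for one radiographic exposure:
    ESD ~ tube output (uGy/mAs at 1 m) * mAs * (1 m / FSD)^2 * backscatter.
    The backscatter factor of 1.35 is an assumed typical value."""
    return output_ugy_per_mas * mas * (1.0 / fsd_m) ** 2 * bsf

# hypothetical exposure: output 60 uGy/mAs at 1 m, 4 mAs, FSD 1.8 m
esd = entrance_surface_dose(60.0, 4.0, 1.8)
print(f"ESD ~ {esd:.1f} uGy")
```

    With a calibrated output-versus-kV table per tube, a radiographer can read off such an estimate for any technique setting, which is the basis of the paper's claim that routine per-patient measurement becomes unnecessary.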

  8. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  9. Sensometrics for Food Quality Control

    DEFF Research Database (Denmark)

    Brockhoff, Per B.

    2011-01-01

    The industrial development of innovative and successful food items and the measuring of food quality in general are difficult without actually letting human beings evaluate the products using their senses at some point in the process. The use of humans as measurement instruments calls for special attention in the modelling and data analysis phase. In this paper the focus is on sensometrics – the "metric" side of the sensory science field. The sensometrics field is introduced and related to the fields of statistics, chemometrics and psychometrics. Some of the most commonly used sensory testing...

  10. Quality of reporting statistics in two Indian pharmacology journals.

    Science.gov (United States)

    Jaykaran; Yadav, Preeti

    2011-04-01

    To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of appropriateness of descriptive statistics and inferential statistics. Descriptive statistics was evaluated on the basis of reporting of method of description and central tendencies. Inferential statistics was evaluated on the basis of fulfilment of the assumptions of statistical methods and appropriateness of statistical tests. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason for this was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Articles published in the two Indian pharmacology journals are not devoid of statistical errors.
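    The SEM-versus-SD error singled out above is worth making concrete: SD describes the spread of the observations, while SEM = SD/sqrt(n) describes the precision of the mean, so the two are not interchangeable. A minimal sketch with invented values:

```python
import math

def sd_and_sem(values):
    """Sample SD (spread of observations) and SEM (precision of the mean)."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean, sd, sd / math.sqrt(n)

# hypothetical response values from one treatment group
mean, sd, sem = sd_and_sem([12.1, 14.3, 11.8, 13.5, 12.9, 13.0])
print(f"mean (SD) = {mean:.2f} ({sd:.2f});  SEM = {sem:.2f}")
```

    Because SEM is always smaller than SD, reporting mean ± SEM where mean (SD) is expected makes data look less variable than they are.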

  11. Robust Control Methods for On-Line Statistical Learning

    Directory of Open Access Journals (Sweden)

    Capobianco Enrico

    2001-01-01

    Full Text Available The issue of ensuring that the results of data processing in an experiment are not affected by the presence of outliers is relevant for statistical control and learning studies. Learning schemes should thus be tested for their capacity to handle outliers in the observed training set, so as to achieve reliable estimates with respect to the crucial bias and variance aspects. We describe possible ways of endowing neural networks with statistically robust properties by defining feasible error criteria. It is convenient to cast neural nets in state space representations and apply both Kalman filter and stochastic approximation procedures in order to suggest statistically robustified solutions for on-line learning.

  12. Control of Bank Consolidated Financial Statements Quality

    Directory of Open Access Journals (Sweden)

    Margarita S. Ambarchyan

    2013-01-01

    Full Text Available The author presents a multiple linear regression model of bank consolidated financial statements quality. The article considers six characteristics that can be used to estimate the level of bank consolidated financial statements quality. The multiple linear regression model was developed using the results of a point-based assessment of the consolidated financial statements of thirty European bank and financial groups on the basis of the developed characteristics. The author proposes using the characteristic significance factor in the process of appraising consolidated financial statements by points. The constructed regression model is checked for accuracy and statistical significance. The model can be used by internal auditors and financial analysts as an instrument for bank and non-bank consolidated financial statements quality control

  13. Quality of reporting statistics in two Indian pharmacology journals

    OpenAIRE

    Jaykaran,; Yadav, Preeti

    2011-01-01

    Objective: To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) website. These articles were evaluated on the basis of appropriateness of descriptive statistics and inferential statistics. Descriptive statistics was evaluated on the basis of...

  14. An Improvement of the Hotelling T2 Statistic in Monitoring Multivariate Quality Characteristics

    Directory of Open Access Journals (Sweden)

    Ashkan Shabbak

    2012-01-01

    Full Text Available The Hotelling T2 statistic is the most popular statistic used in multivariate control charts to monitor multiple qualities. However, this statistic is easily affected by the existence of more than one outlier in the data set. To rectify this problem, robust control charts, which are based on the minimum volume ellipsoid and the minimum covariance determinant, have been proposed. Most researchers assess the performance of multivariate control charts based on the number of signals without paying much attention to whether those signals are really outliers. With due respect, we propose to evaluate control charts not only based on the number of detected outliers but also with respect to their correct positions. In this paper, an Upper Control Limit based on the median and the median absolute deviation is also proposed. The results of this study signify that the proposed Upper Control Limit improves the detection of correct outliers but that it suffers from a swamping effect when the positions of outliers are not taken into consideration. Finally, a robust control chart based on the diagnostic robust generalised potential procedure is introduced to remedy this drawback.
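    The classical statistic the paper starts from is simple to compute: for each observation, T² is the Mahalanobis distance from the sample mean under the sample covariance. The sketch below uses invented bivariate data with two planted outliers (the robust median/MAD limit and the diagnostic procedure the paper proposes are not reproduced here); for p = 2 variables, the chi-square(2) 0.99 quantile used as an approximate UCL is exactly -2·ln(0.01) ≈ 9.21.

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical bivariate quality data (e.g. fill weight and thickness)
X = rng.multivariate_normal([100.0, 5.0], [[1.0, 0.12], [0.12, 0.04]], size=100)
X[10] += [8.0, 1.2]     # planted outliers
X[55] += [-6.0, 1.0]

def hotelling_t2(X):
    """Classical Hotelling T^2 of each row w.r.t. sample mean and covariance."""
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, S_inv, d)   # row-wise d' S^-1 d

t2 = hotelling_t2(X)
ucl = -2 * np.log(0.01)      # chi-square(2) 0.99 quantile ~ 9.21
signals = np.where(t2 > ucl)[0]
print("signals at indices:", signals)
```

    As the abstract argues, counting signals is not enough: a chart should also be judged on whether the signalled positions coincide with the true outliers, which motivates replacing the mean/covariance with median/MAD-based estimates when several outliers can mask one another.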

  15. Quality Assurance/Quality Control Jobs

    Science.gov (United States)

    Fanslau, Melody; Young, Janelle

    The production of a quality and safe food product is essential to the success of any food manufacturing facility. Because of this great importance, a career in quality can be extremely rewarding. Without happy customers willing to buy a product, a company would not be able to survive. Quality issues such as foreign objects, spoiled or mislabeled product, failure to meet net weight requirements, or a recall can all turn customers away from buying a product. The food industry is a customer-driven market in which some consumers are brand loyal based on a history of high quality or in which a single bad experience with a product will turn them away for a lifetime. With this said, the main role of a quality department is to help ensure that quality issues such as these are eliminated or kept to a minimum to maintain or increase the number of customers purchasing their product.

  16. Internal quality control: planning and implementation strategies.

    Science.gov (United States)

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
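    The probabilities that drive this planning process, false rejection on in-control runs versus error detection for a medically important shift, can be estimated by simulation. A minimal Monte-Carlo sketch of the single control rule 1-3s (reject the run if any control result exceeds ±3 SD); the shift size and number of controls are assumed illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_reject_13s(shift_sd, n_controls=2, runs=20_000):
    """Monte-Carlo probability that the 1-3s rule rejects a run.
    shift_sd is a systematic error in SD units (0 = in control)."""
    x = rng.standard_normal((runs, n_controls)) + shift_sd
    return np.mean(np.any(np.abs(x) > 3, axis=1))

pfr = p_reject_13s(0.0)   # probability of false rejection
ped = p_reject_13s(2.0)   # probability of detecting a 2-SD systematic error
print(f"Pfr = {pfr:.4f}, Ped(2 SD shift) = {ped:.3f}")
```

    Repeating this over a range of shift sizes traces out the power function mentioned above, from which one chooses rules and n that meet the "high error detection, low false rejection" design stages.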

  17. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  18. Control of quality in mammography

    International Nuclear Information System (INIS)

    2006-10-01

    The present protocol on quality control/quality assurance in mammography is the result of the work of two regional projects realised in Latin America within the framework of ARCAL with the support of the IAEA. The first is the ARCAL LV (RLA/6/043) project on quality assurance/quality control in mammography studies, which analysed the present situation of mammography in the member countries of the project: Bolivia, Colombia, Costa Rica, Cuba, El Salvador, Guatemala, Nicaragua, Panama, Paraguay, Peru, the Dominican Republic and the Republic of Venezuela. The second is the ARCAL XLIX (RLA/9/035) project, whose members were Brazil, Colombia, Cuba, Chile, Mexico and Peru, which addressed the application of the Basic Safety Standards for protection against ionising radiation, with the aim of improving radiation protection in X-ray diagnosis medical practices through the implementation of the Basic Safety Standards (BSS) related to X-ray diagnosis in selected hospitals in each country involved in the project. The work of both projects has been consolidated and harmonized in the present publication

  19. Quality control of the radiopharmaceuticals

    International Nuclear Information System (INIS)

    Boukarra, Hajer; Boubakri, Rania

    2006-01-01

    This work is a contribution to the quality control of two radiopharmaceuticals. Our study was carried out on the rat. These results enable us to draw the following conclusions: - the control of the purity of the cerebral tracers (cytectrenes) is carried out by HPLC using a radioactivity detector, which offers great sensitivity; - the radiochemical labelling yield of the MDP kit, determined by thin-layer chromatography, is 99%; - the study of the biodistribution in the rat showed a high affinity for bone. These results conform to the European Pharmacopoeia, which enables us to apply for a marketing authorization. (author)

  20. Implementation of quality control systematics for personnel monitoring services

    International Nuclear Information System (INIS)

    Franco, J.O.A.

    1984-01-01

    The implementation of statistical quality control techniques used in industrial practice is proposed for dosimetric services. 'Control charts' and 'sampling inspection' are adapted for control of the measuring process and of the dose results produced in routine, respectively. A chapter on Radiation Protection and Personnel Monitoring was included. (M.A.C.) [pt
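The control-chart adaptation proposed above can be illustrated with a minimal Shewhart-style sketch; the reference dosimeter readings, baseline period, and 3-sigma rule below are illustrative assumptions, not values from the report.

```python
# Hypothetical sketch of a Shewhart control chart adapted to a dosimetry
# service: the laboratory tracks a reference dosimeter's reading and flags
# measurements outside the 3-sigma limits derived from an in-control baseline.
from statistics import mean, stdev

def control_limits(baseline):
    """Center line and 3-sigma limits from an in-control baseline period."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m, m + 3 * s

def out_of_control(readings, lcl, ucl):
    """Indices of readings that fall outside the control limits."""
    return [i for i, x in enumerate(readings) if not (lcl <= x <= ucl)]

baseline = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.00]
lcl, cl, ucl = control_limits(baseline)
alarms = out_of_control([1.01, 0.99, 1.25, 1.00], lcl, ucl)   # index 2 flagged
```

The same two-sided limit check carries over directly to the 'sampling inspection' of routine dose results mentioned in the abstract.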

  1. Metrology and quality control handbook

    International Nuclear Information System (INIS)

    Hofmann, D.

    1983-01-01

    This book tries to present the fundamentals of metrology and quality control in brief surveys. Compromises had to be made in order to reduce the material available to a sensible volume for the sake of clarity. This becomes evident in the following two restrictions: First, in dealing with the theoretical principles of metrology and quality control, mere reference had to be made in many cases to the great variety of special literature, without discussing it to explain further details. Second, in dealing with the application of metrology and quality control techniques in practice, only the basic quantities of the International System of Units (SI) could, as a rule, be taken into account. Some readers will note that many special measuring methods and equipment known to them are not included in this book. I do hope, however, that this shortcoming will show to have a positive effect, too. This book will show the reader how to find the basic quantities and units from the derived quantities and units, and the steps that are necessary to solve any kind of measuring task. (orig./RW) [de

  2. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR type fuel plates by applying statistical techniques of process control. The work was divided into four steps whose data were analyzed: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and application of statistical tools and standard specifications to perform a comparative study of these processes. (author)

  3. Cleaving of TOPAS and PMMA microstructured polymer optical fibers: Core-shift and statistical quality optimization

    DEFF Research Database (Denmark)

    Stefani, Alessio; Nielsen, Kristian; Rasmussen, Henrik K.

    2012-01-01

    We fabricated an electronically controlled polymer optical fiber cleaver, which uses a razor-blade guillotine and provides independent control of fiber temperature, blade temperature, and cleaving speed. To determine the optimum cleaving conditions of microstructured polymer optical fibers (mPOFs) with hexagonal hole structures we developed a program for cleaving quality optimization, which reads in a microscope image of the fiber end-facet and determines the core-shift and the statistics of the hole diameter, hole-to-hole pitch, hole ellipticity, and direction of major ellipse axis. For 125μm in diameter...

  4. Quality in statistics education : Determinants of course outcomes in methods & statistics education at universities and colleges

    NARCIS (Netherlands)

    Verhoeven, P.S.

    2009-01-01

    Although Statistics is not a very popular course according to most students, a majority of students still take it, as it is mandatory in most Social Science departments. Therefore it takes special teaching skills to teach statistics. In order to do so it is essential for teachers to know what

  5. A history of industrial statistics and quality and efficiency improvement

    NARCIS (Netherlands)

    de Mast, J.; Coleman, S.; Greenfield, T.; Stewardson, D.; Montgomery, D.C.

    2008-01-01

    The twentieth century witnessed incredible increases in product quality, while in the same period product prices dropped dramatically. These important improvements in quality and efficiency in industry were the result of innovations in management and engineering. But these developments were

  6. Effective control of complex turbulent dynamical systems through statistical functionals.

    Science.gov (United States)

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
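As a hedged illustration of the test bed named above, the 40-dimensional Lorenz 1996 model can be integrated in a few lines. The time step, forcing F = 8 (a standard fully turbulent regime), and the small initial perturbation are common choices assumed here; the statistical control strategy itself is not reproduced.

```python
# Minimal sketch of the Lorenz 1996 model: dX_i/dt = (X_{i+1} - X_{i-2}) X_{i-1} - X_i + F,
# with cyclic indices. Integrated with a classical fourth-order Runge-Kutta step.
def l96_tendency(x, forcing):
    n = len(x)
    return [(x[(i + 1) % n] - x[(i - 2) % n]) * x[(i - 1) % n] - x[i] + forcing
            for i in range(n)]

def step_rk4(x, dt, forcing):
    """One fourth-order Runge-Kutta step."""
    k1 = l96_tendency(x, forcing)
    k2 = l96_tendency([xi + dt / 2 * k for xi, k in zip(x, k1)], forcing)
    k3 = l96_tendency([xi + dt / 2 * k for xi, k in zip(x, k2)], forcing)
    k4 = l96_tendency([xi + dt * k for xi, k in zip(x, k3)], forcing)
    return [xi + dt / 6 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

# 40 variables; the uniform state x_i = F is a fixed point, so a small
# perturbation is added to trigger the chaotic dynamics.
state = [8.0] * 40
state[0] += 0.01
for _ in range(100):
    state = step_rk4(state, 0.05, 8.0)
```

With nearly half the directions unstable at F = 8, tracking individual perturbations is hopeless, which is exactly why the paper controls statistical functionals of the state instead.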

  7. Quality control in nuclear medicine

    International Nuclear Information System (INIS)

    Kostadinova, I.

    2007-01-01

    Nuclear medicine comprises the diagnosis and therapy of diseases with radiopharmaceuticals. The ambition of all specialists in our country is for this activity to reach European standards. In this connection, a Commission for external audit was formed to evaluate the quality of work in the centers of nuclear medicine. This Commission created a long-term programme based on objective European criteria and the national standard for nuclear medicine, with the aim of increasing the quality of work and providing expert evaluation of the activity in every center. The programme comprises measures for quality control of instrumentation, radiopharmaceuticals, performed investigations and obtained results, and of the whole organization, from the receipt of the isotopes to the reporting of patient results. The ambition is for most of the centers to fulfill the requirements. In conclusion, the quality of everyday nuclear medicine work alone is not enough to increase the prestige of the specialty. Understanding, expert and financial support from the corresponding institutions, including the Ministry of Health, is also necessary for the delivery of new, up-to-date instrumentation with new capabilities. Only then would Bulgarian patients have access to the high-technology equipment for early functional diagnosis and optimal treatment that patients in developed countries already have. (author)

  8. Quality control programme for radiotherapy

    International Nuclear Information System (INIS)

    Campos de Araujo, A.M.; Viegas, C.C.B.; Viamonte, A.M.

    2002-01-01

    A 3-year pilot programme started in January 2000 with 33 philanthropic cancer institutions that provide medical services to 60% of the patients from the national social security system. Brazil today has 161 radiotherapy services (144 operating with megavoltage equipment). These 33 institutions are distributed over 19 Brazilian states. The aims of this programme are: to create conditions that allow the participants to apply radiotherapy with quality and efficacy; and to promote updating courses for the physicians, physicists and technicians of these 33 institutions, with the following objectives: to recommend dosimetric and radiological protection procedures in order to guarantee the prescribed tumor dose and safe working conditions; and to help in establishing and implementing these procedures. The main activities are: local quality control evaluations, postal TLD audits in reference conditions, postal TLD audits in off-axis conditions, and training. The local quality control programme has already evaluated 22 institutions with 43 machines (25 Co-60 and 18 linear accelerators). In these visits we perform dosimetric, electrical, mechanical and safety tests. As foreseen, we found more problems among the old Co-60 machines, i.e., field flatness, size, symmetry and relative output factors; laser positioning system alignment; optical distance indicator; radiation and light field coincidence; and agreement of optical and mechanical distance indicators, than among the linear accelerators, i.e., field flatness and size; laser positioning system alignment; tray interlocking and wedge filter factors

  9. Batch-to-Batch Quality Consistency Evaluation of Botanical Drug Products Using Multivariate Statistical Analysis of the Chromatographic Fingerprint

    OpenAIRE

    Xiong, Haoshu; Yu, Lawrence X.; Qu, Haibin

    2013-01-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure the efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has shown its efficacy and applicability in the quality evaluation of many ...

  10. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Directory of Open Access Journals (Sweden)

    Stephan Birle

    2016-01-01

    Full Text Available In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes by the use of traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define the control performance as still acceptable may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs, control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that alters the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits.
The average absolute error to the
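A minimal sketch of the statistical process corridor idea described above, assuming invented historic runs and a 3-sigma corridor width; the real study derives corridors for the control error and its rate of change from a pool of historic, in-spec fermentation batches.

```python
# Hedged sketch: per-time-point corridors (mean +/- k*sigma) are computed from
# historic in-spec runs, and a running batch is checked against them.
from statistics import mean, stdev

def process_corridor(historic_runs, k=3.0):
    """Per-time-point (lower, upper) bounds from historic in-spec runs."""
    corridor = []
    for values in zip(*historic_runs):          # same time index across runs
        m, s = mean(values), stdev(values)
        corridor.append((m - k * s, m + k * s))
    return corridor

def within_corridor(run, corridor):
    """True if every time point of the run stays inside its corridor."""
    return all(lo <= x <= hi for x, (lo, hi) in zip(run, corridor))

# illustrative control-error trajectories from three historic batches
historic = [[0.0, 0.1, 0.2], [0.1, 0.2, 0.1], [0.0, 0.15, 0.12]]
corridor = process_corridor(historic)
ok = within_corridor([0.05, 0.15, 0.14], corridor)
```

The hypothesis in the abstract is exactly this check: if the control error stays inside the corridor at every time point, the final state is taken to meet the quality definition.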

  11. Family Control and Earnings Quality

    Directory of Open Access Journals (Sweden)

    Carolina Bona Sánchez

    2007-06-01

    Full Text Available This work examines the relationship between family control and earnings quality in a context where the salient agency problem shifts away from the classical divergence between managers and shareholders to conflicts between the controlling owner and minority shareholders. The results reveal that, compared to non-family firms, family firms report higher earnings quality, in terms of both lower discretionary accruals and greater predictability of future cash flows. They also show a positive relationship between the level of voting rights held by the controlling family and earnings quality. The evidence is consistent with the presence of a reputation/long-term involvement effect associated with the family firm. Moreover, the work reflects that, as the divergence between the voting and cash flow rights held by the controlling family decreases, the quality of the accounting information increases. KEYWORDS: voting rights, divergence, family firm, earnings quality, reputation, private benefits.

  12. Physicians, radiologists, and quality control

    International Nuclear Information System (INIS)

    Payne, W.F.

    1973-01-01

    Factors involved in quality control of medical x-ray examinations to achieve the lowest possible exposure to the patient are discussed. It is hoped that film quality will remain of paramount importance, in order to obtain the greatest amount of diagnostic information from each radiographic examination. At the same time, it is hoped that this can be done while further reducing the patient's exposure to ionizing radiation by the methods that have been discussed; namely, education of the physician, radiologist, and technologist; modern protective equipment and departmental construction; efficient collimation, whether automatic or manual; calibration and output measurement of the radiographic and fluoroscopic units; ongoing programs of education within each department or radiographic facility; film badge monitoring; education of and cooperation with the nonradiologic physician; and, hopefully, more intensive programs by the National and State Bureaus and Departments of Radiological Health in education and encouragement of the medical community. (U.S.)

  13. Quality control of pesticide products

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-07-15

    In light of an established need for more efficient analytical procedures, this publication, which documents the findings of an IAEA coordinated research project (CRP) on “Quality Control of Pesticide Products”, simplifies the existing protocol for pesticide analysis while simultaneously upholding existing standards of quality. This publication includes both a report on the development work done in the CRP and a training manual for use by pesticide analysis laboratories. Based on peer reviewed and internationally recognized methods published by the Association of Analytical Communities (AOAC) and the Collaborative International Pesticides Analytical Council (CIPAC), this report provides laboratories with versatile tools to enhance the analysis of pesticide chemicals and to extend the scope of available analytical repertoires. Adoption of the proposed analytical methodologies promises to reduce laboratories’ use of solvents and the time spent on reconfiguration and set-up of analytical equipment.

  14. Quality control of pesticide products

    International Nuclear Information System (INIS)

    2009-07-01

    In light of an established need for more efficient analytical procedures, this publication, which documents the findings of an IAEA coordinated research project (CRP) on “Quality Control of Pesticide Products”, simplifies the existing protocol for pesticide analysis while simultaneously upholding existing standards of quality. This publication includes both a report on the development work done in the CRP and a training manual for use by pesticide analysis laboratories. Based on peer reviewed and internationally recognized methods published by the Association of Analytical Communities (AOAC) and the Collaborative International Pesticides Analytical Council (CIPAC), this report provides laboratories with versatile tools to enhance the analysis of pesticide chemicals and to extend the scope of available analytical repertoires. Adoption of the proposed analytical methodologies promises to reduce laboratories’ use of solvents and the time spent on reconfiguration and set-up of analytical equipment

  15. Numerical and Qualitative Contrasts of Two Statistical Models for Water Quality Change in Tidal Waters

    Science.gov (United States)

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...

  16. Water quality control system and water quality control method

    International Nuclear Information System (INIS)

    Itsumi, Sachio; Ichikawa, Nagayoshi; Uruma, Hiroshi; Yamada, Kazuya; Seki, Shuji

    1998-01-01

    In the water quality control system of the present invention, portions in contact with water comprise a metal material with a controlled iron or chromium content, and the chromium content at the surface is higher than that of the base material, with compressive stresses remaining on the surface after mechanical polishing, forming a uniform corrosion-resistant coating film. In addition, equipment and/or pipelines whose surfaces are treated with a material that stably controls corrosion potential are used. The system includes a cleaning device made of a material that forms few impurities, which detects the intrusion of impurities and removes them selectively depending on the chemical species, and/or a cleaning device for recovering drain water from various equipment to the feedwater, connecting the feedwater and condensate pipelines and removing impurities and corrosion products. The water can thus be kept as neutral purified water, and the concentrations of oxygen and hydrogen in the water are controlled within an optimum range to suppress the formation of corrosion products. (N.H.)

  17. Quality control in breast tomosynthesis

    International Nuclear Information System (INIS)

    Jakubiak, Rosangela Requi; Messias, Pricila Cordeiro; Santos, Marilia Fernanda; Urban, Linei Augusta B.D.

    2014-01-01

    In Brazil, breast cancer is the most common and the leading cause of death among women, with an estimated 57,000 new cases in 2014. Mammography (2D) plays an important role in the early detection of breast cancer, but in some cases it can be difficult to detect malignant lesions due to overlap of breast tissues. Digital Breast Tomosynthesis (DBT: 3D) reduces the effects of overlap, providing improved characterization of mammographic findings. However, the dose may double as compared to mammography. This study presents results of contrast-to-noise ratio (CNR) tests and image quality evaluation on a Siemens Mammomat Inspiration mammography unit with tomosynthesis. The CNR was determined with polymethyl methacrylate (PMMA) plates of 20 to 70 mm thickness and an aluminum plate of 10 mm² area and 0.2 mm thickness. Image quality was assessed with the ACR breast simulator. In the assessment of image quality, the detectability of fibers and masses was identical in the 2D and 3D systems: 4.5 fibers and 4 masses were visualized in both modes. In 2D mode, 3.5 microcalcification groups were identified, and in 3D, 3 groups. The Mean Glandular Dose for the simulator was 1.17 mGy in 2D mode and 2.35 mGy in 3D mode. The result reinforces the importance of quality control in the image acquisition process; the CNR values obtained were in accordance with requirements, ensuring compatible image quality and dose in the 2D and 3D processes
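As a hedged illustration of the contrast-to-noise measurement described above: CNR is commonly computed as the difference between the mean pixel values of the signal (aluminium detail) and background (plain PMMA) regions, divided by the background standard deviation. The ROI pixel values below are invented for the example, not taken from the study.

```python
# Illustrative CNR calculation: CNR = (mean signal - mean background) / sd(background)
from statistics import mean, stdev

def cnr(signal_roi, background_roi):
    return (mean(signal_roi) - mean(background_roi)) / stdev(background_roi)

signal = [210, 205, 208, 212, 207]          # pixels over the aluminium detail
background = [180, 183, 178, 181, 179]      # pixels over plain PMMA
value = cnr(signal, background)
```

In a real QC protocol this would be repeated for each PMMA thickness to verify that CNR stays above the required limit as the simulated breast thickness grows.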

  18. Quality control in breast tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Jakubiak, R.R.; Messias, P.C.; Santos, M.F., E-mail: requi@utfpr.edu.br [Universidade Tecnologia Federal do Parana (UTFPR), Curitiba, PR (Brazil). Departamento Academico de Fisica; Urban, L.A.B.D., E-mail: lineiurban@hotmail.com [Diagnostico Avancado por Imagem, Curitiba, PR (Brazil)

    2015-07-01

    In Brazil, breast cancer is the most common and the leading cause of death among women, with an estimated 57,000 new cases in 2014. Mammography (2D) plays an important role in the early detection of breast cancer, but in some cases it can be difficult to detect malignant lesions due to overlap of breast tissues. Digital Breast Tomosynthesis (DBT: 3D) reduces the effects of overlap, providing improved characterization of mammographic findings. However, the dose may double as compared with mammography. This study presents results of contrast-to-noise ratio (CNR) and image quality evaluation on a Siemens Mammomat Inspiration mammography unit with tomosynthesis. The CNR was determined with polymethyl methacrylate (PMMA) layers of 20 to 70 mm thickness and an aluminum foil of 0.2 mm thickness and 10 mm² area. Image quality was assessed with the ACR breast simulator. In the evaluation of image quality, the detectability of fibers and masses was identical in the 2D and 3D systems: 4.5 fibers and 4 masses were visualized in both modes. In 2D mode, 3.5 microcalcification groups were identified, while 3D showed 3 groups. The Mean Glandular Dose (MGD) for the simulator was 1.17 mGy in 2D mode and 2.35 mGy in 3D mode. The results reinforce the importance of quality control in the image acquisition process; the CNR values obtained were in accordance with requirements, ensuring compatible image quality and dose in the 2D and 3D processes. (author)

  19. Quality control in breast tomosynthesis

    International Nuclear Information System (INIS)

    Jakubiak, R.R.; Messias, P.C.; Santos, M.F.

    2015-01-01

    In Brazil, breast cancer is the most common and the leading cause of death among women, with an estimated 57,000 new cases in 2014. Mammography (2D) plays an important role in the early detection of breast cancer, but in some cases it can be difficult to detect malignant lesions due to overlap of breast tissues. Digital Breast Tomosynthesis (DBT: 3D) reduces the effects of overlap, providing improved characterization of mammographic findings. However, the dose may double as compared with mammography. This study presents results of contrast-to-noise ratio (CNR) and image quality evaluation on a Siemens Mammomat Inspiration mammography unit with tomosynthesis. The CNR was determined with polymethyl methacrylate (PMMA) layers of 20 to 70 mm thickness and an aluminum foil of 0.2 mm thickness and 10 mm² area. Image quality was assessed with the ACR breast simulator. In the evaluation of image quality, the detectability of fibers and masses was identical in the 2D and 3D systems: 4.5 fibers and 4 masses were visualized in both modes. In 2D mode, 3.5 microcalcification groups were identified, while 3D showed 3 groups. The Mean Glandular Dose (MGD) for the simulator was 1.17 mGy in 2D mode and 2.35 mGy in 3D mode. The results reinforce the importance of quality control in the image acquisition process; the CNR values obtained were in accordance with requirements, ensuring compatible image quality and dose in the 2D and 3D processes. (author)

  20. 40 CFR 51.359 - Quality control.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Quality control. 51.359 Section 51.359....359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  1. Multivariate statistical characterization of groundwater quality in Ain ...

    African Journals Online (AJOL)

    Administrator

    depends much on the sustainability of the available water resources. Water of .... 18 wells currently in use were selected based on the preliminary field survey carried out to ... In recent times, multivariate statistical methods have been applied ...

  2. Statistical Process Control. Impact and Opportunities for Ohio.

    Science.gov (United States)

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  3. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Science.gov (United States)

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  4. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.); Optimisation automatique des incidences des faisceaux par l'algorithme du simplexe et optimisation des controles qualite par la Maitrise Statistique des Processus (MSP) en Radiotherapie Conformationnelle par Modulation d'Intensite (RCMI)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, K

    2008-11-15

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered a technique of choice to increase local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of the treatment preparation, two axes of work have been defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm into the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with oropharyngeal cancer showed that the quality of the plan is also increased compared to manual beam selection. Indeed, for an equivalent or even better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results obtained using an ionisation chamber at Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment security using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)
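The capability statement above can be made concrete with a standard Cpk sketch; a process is "statistically capable" when its spread fits well inside the tolerance on the measured-versus-prescribed dose deviation. The ±4% tolerance and the daily measurements below are illustrative assumptions, not values from the thesis.

```python
# Hedged sketch of a process capability check on dose-deviation QC results.
# Cpk relates the distance from the process mean to the nearest specification
# limit to three process standard deviations.
from statistics import mean, stdev

def cpk(samples, lsl, usl):
    """Process capability index against lower/upper specification limits."""
    m, s = mean(samples), stdev(samples)
    return min(usl - m, m - lsl) / (3 * s)

# daily dose deviations in %, illustrative tolerance of +/- 4 %
deviations = [0.5, -0.3, 0.8, 0.1, -0.6, 0.4, 0.0, -0.2, 0.7, -0.4]
index = cpk(deviations, lsl=-4.0, usl=4.0)
capable = index >= 1.33        # a commonly used acceptance threshold
```

A high Cpk on a stable process is precisely the argument the thesis uses to justify reducing the number of pre-treatment controls for prostate plans.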

  5. Multivariate Statistical Analysis of Water Quality data in Indian River Lagoon, Florida

    Science.gov (United States)

    Sayemuzzaman, M.; Ye, M.

    2015-12-01

    The Indian River Lagoon, part of the longest barrier island complex in the United States, is a region of particular concern to environmental scientists because of the rapid rate of human development throughout the region and its geographical position between the colder temperate zone and the warmer sub-tropical zone. Thus, surface water quality analysis in this region continually brings new information. In the present study, multivariate statistical procedures were applied to analyze the spatial and temporal water quality in the Indian River Lagoon over the period 1998-2013. Twelve parameters were analyzed at twelve key water monitoring stations in and beside the lagoon on monthly datasets (a total of 27,648 observations). The dataset was treated using cluster analysis (CA), principal component analysis (PCA) and non-parametric trend analysis. The CA was used to cluster the twelve monitoring stations into four groups, with stations with similar surrounding characteristics being in the same group. The PCA was then applied to the similar groups to find the important water quality parameters. The principal components (PCs) PC1 to PC5 were retained based on cumulative explained variances of 75% to 85% in each cluster group. Nutrient species (phosphorus and nitrogen), salinity, specific conductivity and erosion factors (TSS, turbidity) were the major variables involved in the construction of the PCs. Statistically significant positive or negative trends and abrupt trend shifts were detected by applying the Mann-Kendall trend test and Sequential Mann-Kendall (SQMK) test to each individual station for the important water quality parameters. Land use and land cover change patterns, local anthropogenic activities and extreme climate events such as drought might be associated with these trends. This study presents a multivariate statistical assessment in order to get better information about the quality of surface water. 
Thus, effective pollution control/management of the surface
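A hedged, pure-Python sketch of the PCA step described above: the water quality variables are standardized, and the leading principal component is extracted by power iteration on the correlation matrix. The three "parameters" and their values are invented for illustration; a real analysis of 27,648 observations would use a statistics package.

```python
# Illustrative PCA via power iteration on standardized (z-scored) variables.
from statistics import mean, stdev

def standardize(columns):
    return [[(x - mean(c)) / stdev(c) for x in c] for c in columns]

def covariance(a, b):
    n = len(a)
    return sum(x * y for x, y in zip(a, b)) / (n - 1)

def first_principal_component(columns, iters=200):
    """Leading eigenvector of the correlation matrix by power iteration."""
    z = standardize(columns)
    p = len(z)
    cov = [[covariance(z[i], z[j]) for j in range(p)] for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# three correlated "parameters" (e.g. salinity, conductivity, TSS), 6 samples
salinity     = [30.0, 31.0, 29.5, 32.0, 30.5, 31.5]
conductivity = [45.0, 46.5, 44.0, 48.0, 45.5, 47.0]
tss          = [5.0, 4.0, 5.5, 3.5, 4.8, 3.8]
pc1 = first_principal_component([salinity, conductivity, tss])
```

With salinity and conductivity rising together while TSS falls, the leading component loads the first two with one sign and TSS with the opposite sign, which is how such studies identify the dominant variable groupings.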

  6. 2. Product quality control and assurance system

    International Nuclear Information System (INIS)

    1990-01-01

    Product quality control and assurance are dealt with in relation to reliability in nuclear power engineering. The topics treated include product quality control in nuclear power engineering, product quality assurance of nuclear power plant equipment, quality assurance programs, classification of selected nuclear power equipment, and standards relating to quality control and assurance and to nuclear power engineering. Particular attention is paid to Czechoslovak and CMEA standards. (P.A.). 2 figs., 1 tab., 12 refs

  7. Internal quality control: best practice.

    Science.gov (United States)

    Kinns, Helen; Pitkin, Sarah; Housley, David; Freedman, Danielle B

    2013-12-01

    There is a wide variation in laboratory practice with regard to implementation and review of internal quality control (IQC). A poor approach can lead to a spectrum of scenarios from validation of incorrect patient results to over investigation of falsely rejected analytical runs. This article will provide a practical approach for the routine clinical biochemistry laboratory to introduce an efficient quality control system that will optimise error detection and reduce the rate of false rejection. Each stage of the IQC system is considered, from selection of IQC material to selection of IQC rules, and finally the appropriate action to follow when a rejection signal has been obtained. The main objective of IQC is to ensure day-to-day consistency of an analytical process and thus help to determine whether patient results are reliable enough to be released. The required quality and assay performance varies between analytes as does the definition of a clinically significant error. Unfortunately many laboratories currently decide what is clinically significant at the troubleshooting stage. Assay-specific IQC systems will reduce the number of inappropriate sample-run rejections compared with the blanket use of one IQC rule. In practice, only three or four different IQC rules are required for the whole of the routine biochemistry repertoire as assays are assigned into groups based on performance. The tools to categorise performance and assign IQC rules based on that performance are presented. Although significant investment of time and education is required prior to implementation, laboratories have shown that such systems achieve considerable reductions in cost and labour.
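The assay-specific rule selection discussed above can be illustrated with two classic Westgard-style rules: 1-3s (one control more than 3 SD from target) and 2-2s (two consecutive controls more than 2 SD on the same side). The sodium target, SD, and control results below are invented for the example.

```python
# Hedged IQC sketch: results are converted to z-scores against the assigned
# target and SD, then checked against the rules selected for that assay.
def violates_1_3s(z_scores):
    """One control observation beyond 3 SD."""
    return any(abs(z) > 3 for z in z_scores)

def violates_2_2s(z_scores):
    """Two consecutive observations beyond 2 SD on the same side."""
    return any(z_scores[i] > 2 and z_scores[i + 1] > 2 or
               z_scores[i] < -2 and z_scores[i + 1] < -2
               for i in range(len(z_scores) - 1))

def qc_accept(results, target, sd, rules):
    z = [(x - target) / sd for x in results]
    return not any(rule(z) for rule in rules)

# illustrative sodium IQC material: target 140 mmol/L, SD 1.5 mmol/L
rules = [violates_1_3s, violates_2_2s]
run_ok  = qc_accept([141.0, 139.0, 140.5], 140.0, 1.5, rules)   # accepted
run_bad = qc_accept([143.5, 143.2, 140.0], 140.0, 1.5, rules)   # 2-2s rejection
```

Assigning a different `rules` list per performance group, rather than one blanket rule, is exactly the mechanism the article credits with reducing false rejections.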

  8. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  9. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine.
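The moving range (XmR) chart the authors recommend derives its limits from the average moving range rather than a pooled SD. A minimal sketch, with illustrative daily ED metrics and the standard XmR constants (2.66 and 3.267 for subgroups of size 2):

```python
def xmr_limits(x):
    """Individuals (X) and moving-range (mR) chart limits for an XmR chart."""
    mr = [abs(b - a) for a, b in zip(x, x[1:])]   # ranges between consecutive points
    xbar = sum(x) / len(x)
    mrbar = sum(mr) / len(mr)
    return {
        "x_center": xbar,
        "x_ucl": xbar + 2.66 * mrbar,   # 3 / d2, with d2 = 1.128 for n = 2
        "x_lcl": xbar - 2.66 * mrbar,
        "mr_ucl": 3.267 * mrbar,        # D4 constant for n = 2
    }

# e.g. median door-to-doctor time in minutes, one value per day (illustrative)
waits = [42, 38, 45, 40, 44, 39, 43]
print({k: round(v, 2) for k, v in xmr_limits(waits).items()})
```

A point outside `x_ucl`/`x_lcl` is special cause variation (signal) requiring action; points fluctuating inside the limits are common cause variation (noise) from a stable process.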

  10. Implementing self sustained quality control procedures in a clinical laboratory.

    Science.gov (United States)

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory; it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and thereby strengthens the health care system overall. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time-consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for the preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules over a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure that laboratories can perform with minimal technology, expenditure and expertise to improve the reliability and validity of test reports.

  11. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained.

  12. Multivariate statistical characterization of groundwater quality in Ain ...

    African Journals Online (AJOL)

    Administrator

    blended water (group 3), based on the similarity of groundwater quality characteristics. Principal component analysis, applied to the data sets of the three different groups obtained from ...... from Butucatu aquifer in Sao Paulo State, Brazil.

  13. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Science.gov (United States)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.

  14. Quality of statistical reporting in developmental disability journals.

    Science.gov (United States)

    Namasivayam, Aravind K; Yan, Tina; Wong, Wing Yiu Stephanie; van Lieshout, Pascal

    2015-12-01

    Null hypothesis significance testing (NHST) dominates quantitative data analysis, but its use is controversial and has been heavily criticized. The American Psychological Association has advocated the reporting of effect sizes (ES), confidence intervals (CIs), and statistical power analysis to complement NHST results to provide a more comprehensive understanding of research findings. The aim of this paper is to carry out a sample survey of statistical reporting practices in two journals with the highest h5-index scores in the areas of developmental disability and rehabilitation. Using a checklist that includes critical recommendations by the American Psychological Association, we examined 100 randomly selected articles out of 456 articles reporting inferential statistics in the year 2013 in the Journal of Autism and Developmental Disorders (JADD) and Research in Developmental Disabilities (RDD). The results showed that for both journals, ES were reported only half the time (JADD 59.3%; RDD 55.87%). These findings are similar to psychology journals, but are in stark contrast to ES reporting in educational journals (73%). Furthermore, a priori power and sample size determination (JADD 10%; RDD 6%), along with reporting and interpreting precision measures (CI: JADD 13.33%; RDD 16.67%), were the least reported metrics in these journals, but not dissimilar to journals in other disciplines. To advance the science in developmental disability and rehabilitation and to bridge the research-to-practice divide, reforms in statistical reporting, such as providing supplemental measures to NHST, are clearly needed.

  15. 14 CFR 21.139 - Quality control.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Quality control. 21.139 Section 21.139... PROCEDURES FOR PRODUCTS AND PARTS Production Certificates § 21.139 Quality control. The applicant must show that he has established and can maintain a quality control system for any product, for which he...

  16. Quality and reliability control on assemblies

    International Nuclear Information System (INIS)

    Mueller, H.

    1976-01-01

    Taking as an example electronic assemblies in printed circuit board engineering, quality control during manufacture is dealt with. After giving a survey of four phases of quality and reliability control, some specific methods of quality control are dealt with by means of a flowchart, and by some examples the necessity and the success of these measures are shown. (RW) [de

  17. 7 CFR 930.44 - Quality control.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  18. 33 CFR 385.21 - Quality control.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  19. 7 CFR 981.42 - Quality control.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  20. Association between product quality control and process quality control of bulk milk

    NARCIS (Netherlands)

    Velthuis, A.; Asseldonk, van M.A.P.M.

    2010-01-01

    Assessment of dairy-milk quality is based on product quality control (testing bulk-milk samples) and process quality control (auditing dairy farms). It is unknown whether process control improves product quality. To quantify possible association between product control and process control a

  1. Multivariate Statistical Process Control Charts and the Problem of Interpretation: A Short Overview and Some Applications in Industry

    OpenAIRE

    Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios

    2005-01-01

    Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...

  2. Statistical transformation and the interpretation of inpatient glucose control data.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
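The EWMA statistic and its time-varying control limits can be sketched as below. The glucose values, target mean and SD are illustrative, and in the authors' workflow a Box-Cox transformation of the POC-BG data would be applied first:

```python
import math

def ewma_chart(x, mean, sd, lam=0.2, L=3.0):
    """Return indices (1-based) where the EWMA statistic breaches its control limits."""
    z, ooc = mean, []
    for i, xi in enumerate(x, start=1):
        z = lam * xi + (1 - lam) * z             # exponentially weighted moving average
        half = L * sd * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mean) > half:                 # limits widen toward their asymptote
            ooc.append(i)
    return ooc

# POC-BG values in mg/dL against an in-control mean/SD (illustrative numbers);
# a Box-Cox transform (e.g. scipy.stats.boxcox) would be applied to the raw data
# first so that these normal-theory limits are appropriate
print(ewma_chart([148, 152, 160, 171, 185, 190], mean=150, sd=12))   # → [5, 6]
```

As the article shows, running this on transformed versus nontransformed data can move, create, or remove the out-of-control signals.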

  3. Topological and statistical properties of quantum control transition landscapes

    International Nuclear Information System (INIS)

    Hsieh, Michael; Wu Rebing; Rabitz, Herschel; Rosenthal, Carey

    2008-01-01

    A puzzle arising in the control of quantum dynamics is to explain the relative ease with which high-quality control solutions can be found in the laboratory and in simulations. The emerging explanation appears to lie in the nature of the quantum control landscape, which is an observable as a function of the control variables. This work considers the common case of the observable being the transition probability between an initial and a target state. For any controllable quantum system, this landscape contains only global maxima and minima, and no local extrema traps. The probability distribution function for the landscape value is used to calculate the relative volume of the region of the landscape corresponding to good control solutions. The topology of the global optima of the landscape is analysed and the optima are shown to have inherent robustness to variations in the controls. Although the relative landscape volume of good control solutions is found to shrink rapidly as the system Hilbert space dimension increases, the highly favourable landscape topology at and away from the global optima provides a rationale for understanding the relative ease of finding high-quality, stable quantum optimal control solutions

  4. SALE, Quality Control of Analytical Chemical Measurements

    International Nuclear Information System (INIS)

    Bush, W.J.; Gentillon, C.D.

    1985-01-01

    1 - Description of problem or function: The Safeguards Analytical Laboratory Evaluation (SALE) program is a statistical analysis program written to analyze the data received from laboratories participating in the SALE quality control and evaluation program. The system is aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated and participants are informed of the accuracy and precision of their results. 2 - Method of solution: Various statistical techniques produce the SALE output. Assuming an unbalanced nested design, an analysis of variance is performed, resulting in a test of significance for time and analyst effects. A trend test is performed. Both within-laboratory and between-laboratory standard deviations are calculated. 3 - Restrictions on the complexity of the problem: Up to 1500 pieces of data for each nuclear material sampled by a maximum of 75 laboratories may be analyzed.

  5. Statistical physics of human beings in games: Controlled experiments

    Science.gov (United States)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  6. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO₂ concentrations that could be used to generate the statistics demanded by vegetation response models

  7. Statistical process control charts for monitoring military injuries.

    Science.gov (United States)

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested.
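The u-chart described above scales its 3-SD limits by each subgroup's exposure. A minimal sketch with illustrative quarterly counts and person-years; in practice the centre line would be fixed from a historical baseline, as the authors do with the 2007-2013 data:

```python
import math

def u_chart(counts, exposures):
    """Shewhart u-chart: flag subgroups whose rate breaches 3-sigma limits."""
    ubar = sum(counts) / sum(exposures)       # centre line: pooled average rate
    out = []
    for i, (c, n) in enumerate(zip(counts, exposures)):
        half = 3 * math.sqrt(ubar / n)        # limits widen as exposure shrinks
        if abs(c / n - ubar) > half:
            out.append(i)
    return ubar, out

# quarterly injury counts and person-years at one installation (illustrative)
counts = [310, 295, 305, 300, 298, 410]
person_years = [250, 250, 250, 250, 250, 250]
ubar, signals = u_chart(counts, person_years)
print(signals)   # → [5]
```

Because the limits depend on each quarter's exposure `n`, installations of different sizes get appropriately different limits around the same rate scale.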

  8. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Science.gov (United States)

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  9. A new instrument for statistical process control of thermoset molding

    International Nuclear Information System (INIS)

    Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.

    1991-01-01

    The recent development of a rugged ceramic mold-mounted dielectric sensor and high-speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work, special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points as well as temperature and pressure were then recorded during normal production and plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper examples of both RIM and SMC are discussed
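Extracting critical points such as the viscosity minimum and reaction endpoint from a cure signal can be sketched as below. The signal values and the flatness threshold are illustrative assumptions, not the ICAM-1000's actual algorithm:

```python
def cure_critical_points(t, y, end_slope=0.01):
    """Locate the viscosity minimum and reaction endpoint in a cure signal y(t)."""
    i_min = min(range(len(y)), key=y.__getitem__)        # global viscosity minimum
    i_end = None
    for i in range(i_min + 1, len(y)):
        slope = (y[i] - y[i - 1]) / (t[i] - t[i - 1])
        if 0 <= slope < end_slope:                       # curve has flattened: endpoint
            i_end = i
            break
    return t[i_min], None if i_end is None else t[i_end]

t = [0, 1, 2, 3, 4, 5, 6, 7]                       # minutes into the molding cycle
y = [8.0, 6.5, 5.2, 5.0, 6.1, 7.4, 8.0, 8.005]     # log ion viscosity (illustrative)
print(cure_critical_points(t, y))                  # → (3, 7)
```

Each extracted point then becomes one observation per molding run, ready to plot on an SPC chart across thousands of runs.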

  10. Statistical disclosure control for microdata methods and applications in R

    CERN Document Server

    Templ, Matthias

    2017-01-01

    This book on statistical disclosure control presents the theory, applications and software implementation of the traditional approach to (micro)data anonymization, including data perturbation methods, disclosure risk, data utility, information loss and methods for simulating synthetic data. Introducing readers to the R packages sdcMicro and simPop, the book also features numerous examples and exercises with solutions, as well as case studies with real-world data, accompanied by the underlying R code to allow readers to reproduce all results. The demand for and volume of data from surveys, registers or other sources containing sensitive information on persons or enterprises have increased significantly over the last several years. At the same time, privacy protection principles and regulations have imposed restrictions on the access and use of individual data. Proper and secure microdata dissemination calls for the application of statistical disclosure control methods to the data before release. This book is in...

  11. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  12. Application of statistical process control to qualitative molecular diagnostic assays.

    Directory of Open Access Journals (Sweden)

    Cathal P O'Brien

    2014-11-01

    Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to qualitative assays is an obvious disadvantage, this study aimed to solve the problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers, with a resultant protracted time to detection. Modelled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
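The frequency-plus-confidence-interval check the authors describe can be sketched with a Wilson score interval; the counts and expected frequency below are illustrative:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Approximate 95% Wilson score interval for an observed proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def frequency_in_control(k, n, expected):
    """In control if the expected mutation frequency lies inside the CI."""
    lo, hi = wilson_ci(k, n)
    return lo <= expected <= hi

# 12 mutation-positive results in 200 samples vs an expected 15% positivity
print(frequency_in_control(12, 200, expected=0.15))   # → False
```

The sample-size caveat in the abstract follows directly: the interval narrows only as `n` grows, so low mutation frequencies and small deviations need many samples before the expected value falls outside the CI.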

  13. Quality control analysis at the hospital

    International Nuclear Information System (INIS)

    Kristensen, K.

    1979-01-01

    Quality control analysis is an integral part of quality assurance. In a system such as that for radiopharmaceuticals, where part of the finishing of the product takes place at individual hospitals, the need for quality control analysis at the hospital can be discussed. Data are presented that stress the importance of quality control by the manufacturer as a basis for limiting such work at hospitals. A simplified programme is proposed

  14. Quality control guarantees the safety of radiotherapy

    International Nuclear Information System (INIS)

    Aaltonen, P.

    1994-01-01

    While radiotherapy equipment has seen some decisive improvements in the last few decades, the technology has also become more complicated. The advanced equipment produces increasingly good treatment results, but the condition of the equipment must be controlled efficiently so as to eliminate any defects that might jeopardise patient safety. The quality assurance measures that are taken to show that certain equipment functions as required are known as quality control. The advanced equipment and stricter requirements set for the precision of radiotherapy have meant that more attention must be paid to quality control. The present radiation legislation stipulates that radiotherapy equipment must undergo regular quality control. The implementation of the quality control is supervised by the Finnish Centre for Radiation and Nuclear Safety (STUK). Hospitals carry out quality control in accordance with a programme approved by STUK, and STUK inspectors periodically visit hospitals to check the results of quality control. (orig.)

  15. Radiopharmaceutical quality control-Pragmatic approach

    International Nuclear Information System (INIS)

    Barbier, Y.

    1994-01-01

    Quality control must be considered in a practical manner. Radiopharmaceuticals are drugs and must satisfy quality assurance requirements, so that the products conform to the Pharmacopoeia. Nevertheless, the user must sometimes verify certain data, especially the radiochemical purity and pH value. For all administered solutions, four controls are compulsory: radionuclide identity, administered radioactivity, organoleptic character and pH

  16. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for the recognition of control chart patterns (CCPs) in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy.

  17. Expert database system for quality control

    Science.gov (United States)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today. Markets are not homogeneous; they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs and, above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating the knowledge of experts in operations, quality management and computer systems, uses all information relevant to quality management (facts as well as rules) to determine whether a product meets quality standards. Keywords: expert system, quality control, database.

  18. Quality control of activity detectors

    International Nuclear Information System (INIS)

    Surma, M.J.

    2002-01-01

    The conditions that determine the quality of radiometric measurements, such as geometry, background, calibration, etc., are described. Testing methods for achieving high-quality radioactivity measurements with nuclear medicine instruments are recommended

  19. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO₂ emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes, precautionary stock fees and oil pollution fees

  20. 30 CFR 74.6 - Quality control.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control. 74.6 Section 74.6 Mineral... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and... DUST SAMPLING DEVICES Approval Requirements for Coal Mine Dust Personal Sampler Unit § 74.6 Quality...

  1. Employee quality, monitoring environment and internal control

    OpenAIRE

    Chunli Liu; Bin Lin; Wei Shu

    2017-01-01

    We investigate the effect of internal control employees (ICEs) on internal control quality. Using special survey data from Chinese listed firms, we find that ICE quality has a significant positive influence on internal control quality. We examine the effect of monitoring on this result and find that the effect is more pronounced for firms with strict monitoring environments, especially when the firms implement the Chinese internal control regulation system (CSOX), have higher institutional ow...

  2. Problems of quality assurance and quality control in diagnostic radiology

    International Nuclear Information System (INIS)

    Angerstein, W.

    1986-01-01

    Topical problems of quality assurance and quality control in diagnostic radiology are discussed and possible solutions are shown. Complex units are differentiated with reference to physicians, technicians, organization of labour, methods of examination and indication. Quality control of radiologic imaging systems should involve three stages: (1) simple tests carried out by radiologic technicians, (2) measurements by service technicians, (3) testing of products by the manufacturer and independent governmental or health service test agencies. (author)

  3. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique adaptive statistical iterative reconstruction V (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up a computed tomography scan both with ASIR and with ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower with ASIR-V. Adaptive statistical iterative reconstruction-V had a higher performance for the subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique with the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  4. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  5. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  6. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  7. Quality control in tile production

    Science.gov (United States)

    Kalviainen, Heikki A.; Kukkonen, Saku; Hyvarinen, Timo S.; Parkkinen, Jussi P. S.

    1998-10-01

    This work studies visual quality control in the ceramics industry. In tile manufacturing it is important that every tile in a set looks similar; for example, the tiles should have similar color and texture. Our goal is to design a machine vision system that can estimate whether tiles are sufficiently similar in appearance to the human eye. Currently this estimation is usually done by human vision. Differing from other approaches, we aim to use an accurate spectral representation of color, and we compare spectral features to RGB color features. A laboratory system for color measurement was built. Experiments with five classes of brown tiles are presented. We use chromaticity RGB features and several spectral features for classification with the k-NN classifier and with a neural network, the Self-Organizing Map. We can classify many of the tiles, but several problems need further investigation: larger training and test sets are needed, illumination effects must be studied further, and more suitable spectral features are needed together with more sophisticated classifiers. It is also interesting to develop the neural approach further.
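The k-NN classification step described in this record can be sketched as follows; the mean-RGB feature vectors and the two shade labels are hypothetical illustrations, not data from the study:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance)."""
    ranked = sorted(train, key=lambda sample: math.dist(sample[0], query))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical mean-RGB features for two shades of brown tile
train = [
    ((120, 80, 50), "shade_A"), ((122, 82, 49), "shade_A"),
    ((118, 79, 52), "shade_A"), ((140, 95, 60), "shade_B"),
    ((142, 97, 58), "shade_B"), ((139, 94, 61), "shade_B"),
]
print(knn_predict(train, (121, 81, 50)))  # -> shade_A
```

Spectral features would simply be longer vectors (one component per wavelength band) fed through the same distance computation.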

  8. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Science.gov (United States)

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
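The control-chart thresholding that SProCoP applies to QC metrics can be illustrated with a minimal Shewhart-style sketch; the retention-time values and the 3-sigma rule here are illustrative assumptions, not the tool's exact implementation:

```python
import statistics

def control_limits(baseline, k=3.0):
    """Estimate center line and k-sigma control limits from baseline QC runs."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - k * sd, mean, mean + k * sd

def out_of_control(values, lcl, ucl):
    """Return indices of runs whose metric falls outside the limits."""
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

# Baseline retention times (min) for a QC peptide standard (hypothetical)
baseline = [12.01, 12.03, 11.98, 12.00, 12.02, 11.99, 12.01, 12.00]
lcl, center, ucl = control_limits(baseline)

# New runs: the last one drifts and should be flagged
new_runs = [12.02, 11.99, 12.35]
print(out_of_control(new_runs, lcl, ucl))  # -> [2]
```

As in the paper, the limits are derived empirically from user-defined QC standards rather than fixed a priori.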

  9. Analytical techniques and quality control in biomedical trace element research

    DEFF Research Database (Denmark)

    Heydorn, K.

    1994-01-01

    The small number of analytical results in trace element research calls for special methods of quality control. It is shown that when the analytical methods are in statistical control, only small numbers of duplicate or replicate results are needed to ascertain the absence of systematic errors. ... Measurement compatibility is obtained by control of traceability to certified reference materials. (C) 1994 Wiley-Liss, Inc.
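The duplicate-based precision check described here can be sketched as a chi-square comparison of observed duplicate differences against the a priori standard deviation; the data, the sigma value, and the 95% critical value are illustrative assumptions:

```python
def analysis_of_precision(pairs, sigma):
    """Analysis-of-precision statistic: T compares duplicate differences
    with the a priori standard deviation sigma; T follows a chi-square
    distribution with len(pairs) degrees of freedom when the method is
    in statistical control."""
    return sum((a - b) ** 2 / (2 * sigma ** 2) for a, b in pairs)

# Hypothetical duplicate trace-element results (mg/kg), a priori sigma 0.05
pairs = [(1.02, 1.08), (0.97, 1.01), (1.10, 1.04), (0.99, 0.95)]
T = analysis_of_precision(pairs, 0.05)
in_control = T < 9.49  # chi-square 95th percentile for 4 degrees of freedom
print(round(T, 2), in_control)  # -> 2.08 True
```

Excess variability (T above the critical value) would signal an analytical error even with only a handful of duplicates.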

  10. Quality research in healthcare: are researchers getting enough statistical support?

    Directory of Open Access Journals (Sweden)

    Ambler Gareth

    2006-01-01

    Full Text Available Abstract Background Reviews of peer-reviewed health studies have highlighted problems with their methodological quality. As published health studies form the basis of many clinical decisions, including the evaluation and provision of health services, this has scientific and ethical implications. The lack of involvement of methodologists (defined as statisticians or quantitative epidemiologists) has been suggested as one key reason for this problem, and this has been linked to the lack of access to methodologists. This issue was highlighted several years ago, and it was suggested that more investment was needed from health care organisations and universities to alleviate the problem. Methods To assess the current level of methodological support available for health researchers in England, we surveyed the 25 National Health Service Trusts in England that are the major recipients of the Department of Health's research and development (R&D) support funding. Results and discussion The survey shows that the earmarking of resources to provide appropriate methodological support to health researchers in these organisations is not widespread. Neither the level of R&D support funding received nor the volume of research undertaken by these organisations showed any association with the amount they spent on providing a central resource for methodological support for their researchers. Conclusion The promotion and delivery of high quality health research requires that organisations hosting health research and their academic partners put in place funding and systems to provide appropriate methodological support to ensure valid research findings. If resources are limited, health researchers may have to rely on short courses and/or a limited number of advisory sessions, which may not always produce satisfactory results.

  11. HIV quality report cards: impact of case-mix adjustment and statistical methods.

    Science.gov (United States)

    Ohl, Michael E; Richardson, Kelly K; Goto, Michihiko; Vaughan-Sarrazin, Mary; Schweizer, Marin L; Perencevich, Eli N

    2014-10-15

    There will be increasing pressure to publicly report and rank the performance of healthcare systems on human immunodeficiency virus (HIV) quality measures. To inform discussion of public reporting, we evaluated the influence of case-mix adjustment when ranking individual care systems on the viral control quality measure. We used data from the Veterans Health Administration (VHA) HIV Clinical Case Registry and administrative databases to estimate case-mix adjusted viral control for 91 local systems caring for 12 368 patients. We compared results using 2 adjustment methods, the observed-to-expected estimator and the risk-standardized ratio. Overall, 10 913 patients (88.2%) achieved viral control (viral load ≤400 copies/mL). Prior to case-mix adjustment, system-level viral control ranged from 51% to 100%. Seventeen (19%) systems were labeled as low outliers (performance significantly below the overall mean) and 11 (12%) as high outliers. Adjustment for case mix (patient demographics, comorbidity, CD4 nadir, time on therapy, and income from VHA administrative databases) reduced the number of low outliers by approximately one-third, but results differed by method. The adjustment model had moderate discrimination (c statistic = 0.66), suggesting potential for unadjusted risk when using administrative data to measure case mix. Case-mix adjustment affects rankings of care systems on the viral control quality measure. Given the sensitivity of rankings to the selection of case-mix adjustment methods (and the potential for unadjusted risk when using variables limited to current administrative databases), the HIV care community should explore optimal methods for case-mix adjustment before moving forward with public reporting. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
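The observed-to-expected estimator compared in this study can be sketched as follows; the per-patient outcomes and model-predicted probabilities are hypothetical, and the 88.2% overall rate is taken from the abstract only for illustration:

```python
def observed_to_expected(observed, expected_probs):
    """Case-mix-adjusted performance as O/E: observed successes divided by
    the sum of model-predicted success probabilities for the same patients."""
    return sum(observed) / sum(expected_probs)

# Hypothetical system: 8 of 10 patients achieved viral control; the
# case-mix model predicted these per-patient probabilities of control.
observed = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
expected = [0.9, 0.85, 0.8, 0.7, 0.95, 0.9, 0.6, 0.85, 0.9, 0.8]
ratio = observed_to_expected(observed, expected)

# An O/E near 1 means the system performed about as its case mix predicts;
# the adjusted rate is often reported as O/E times the overall mean rate.
adjusted_rate = ratio * 0.882
print(round(ratio, 3), round(adjusted_rate, 3))
```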

  12. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
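The four-to-one specification-to-uncertainty requirement and the root-sum-square combination of calibration error sources mentioned above can be sketched as follows; the component values are hypothetical, not the force machine's actual error budget:

```python
import math

def combined_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

def meets_four_to_one(instrument_spec, standard_uncertainty):
    """Check the four-to-one ratio of instrument specification to
    calibrating-standard uncertainty."""
    return instrument_spec / standard_uncertainty >= 4.0

# Hypothetical force-calibration error sources (N)
u = combined_uncertainty([0.3, 0.4, 0.12])
print(round(u, 3))                 # combined standard uncertainty
print(meets_four_to_one(2.5, u))   # does a 2.5 N spec satisfy 4:1?
```

Quantifying each source this way also shows which term dominates and is therefore the best target for reduction.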

  13. Production process and quality control for the HTTR fuel

    International Nuclear Information System (INIS)

    Yoshimuta, S.; Suzuki, N.; Kaneko, M.; Fukuda, K.

    1991-01-01

    Development of the production and inspection technology for High Temperature Engineering Test Reactor (HTTR) fuel has been carried out by cooperative work between Japan Atomic Energy Research Institute (JAERI) and Nuclear Fuel Industries, Ltd (NFI). The performance and the quality level of the developed fuel are well established to meet the design requirements of the HTTR. For the commercial scale production of the fuel, statistical quality control and quality assurance must be carefully considered in order to assure the safety of the HTTR. It is also important to produce the fuel under well controlled process condition. To meet these requirements in the production of the HTTR fuel, a new production process and quality control system is to be introduced in the new facilities. The main feature of the system is a computer integrated control system. Process control data at each production stage of products and semi-products are all gathered by terminal computers and processed by a host computer. The processed information is effectively used for the production, quality and accountancy control. With the aid of this system, all the products will be easily traceable from starting materials to final stages and the statistical evaluation of the quality of products becomes more reliable. (author). 8 figs

  14. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  15. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  16. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Statistical process control applied to the manufacturing of beryllia ceramics

    International Nuclear Information System (INIS)

    Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.

    1991-01-01

    To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical

  18. INSTITUTIONAL MANAGEMENT OF EUROPEAN STATISTICS AND OF THEIR QUALITY - CURRENT CONCERNS AT EUROPEAN LEVEL

    Directory of Open Access Journals (Sweden)

    Daniela ŞTEFĂNESCU

    2011-08-01

    Full Text Available The issues referring to official statistics quality and reliability became the main topics of debate as far as statistical governance in Europe is concerned. The Council welcomed the Commission Communication to the European Parliament and to the Council « Towards robust quality management for European Statistics » (COM 211), appreciating that the approach and the objective of the strategy would confer on the European Statistical System (ESS) the quality management framework for the coordination of consolidated economic policies. The Council pointed out that European Statistical System management was improved during recent years and that progress was noticed in relation to high quality statistics production and dissemination within the European Union, but also noticed that, in the context of the recent financial crisis, certain weaknesses were identified, particularly related to the general quality management framework. The "Greece Case" proved that progress was not enough to guarantee the complete independence of national statistical institutes, and it entailed the need for further consolidating ESS governance. Several undertakings are now in the preparatory stage, in accordance with the Commission Communication; these actions are welcomed, but the question arises: are they sufficient for definitively solving the problem? The paper aims to go ahead in the attempt to identify a different, innovative (courageous!) way, over the long run, towards an advanced institutional structure of the ESS, by setting up a European System of Statistical Institutes, similar to the European System of Central Banks, which would require a change in the Treaty.

  19. Teaching Quality Control with Chocolate Chip Cookies

    Science.gov (United States)

    Baker, Ardith

    2014-01-01

    Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…
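The cookie exercise lends itself to a c-chart for counts of chips per cookie; a minimal sketch with hypothetical chip counts (the classroom data are not given in the record):

```python
import math

def c_chart_limits(counts):
    """Control limits for a c-chart of counts per unit (chips per cookie):
    center line = mean count, limits = cbar +/- 3*sqrt(cbar),
    with the lower limit floored at zero."""
    cbar = sum(counts) / len(counts)
    spread = 3 * math.sqrt(cbar)
    return max(0.0, cbar - spread), cbar, cbar + spread

# Hypothetical chip counts from a sample of cookies
chips = [12, 15, 11, 14, 13, 16, 12, 15, 14, 13]
lcl, center, ucl = c_chart_limits(chips)
print(round(lcl, 2), center, round(ucl, 2))
```

A cookie whose count falls outside (lcl, ucl) would be plotted as an out-of-control point on the chart.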

  20. Batch-to-batch quality consistency evaluation of botanical drug products using multivariate statistical analysis of the chromatographic fingerprint.

    Science.gov (United States)

    Xiong, Haoshu; Yu, Lawrence X; Qu, Haibin

    2013-06-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has shown its efficacy and applicability in the quality evaluation of many kinds of industrial products. In this paper, the combined use of multivariate statistical analysis and chromatographic fingerprinting is presented to evaluate batch-to-batch quality consistency of botanical drug products. A typical botanical drug product in China, Shenmai injection, was selected as the example to demonstrate the feasibility of this approach. The high-performance liquid chromatographic fingerprint data of historical batches were collected from a traditional Chinese medicine manufacturing factory. Characteristic peaks were weighted by their variability among production batches. A principal component analysis model was established after outliers were modified or removed. Multivariate (Hotelling T² and DModX) control charts were then successfully applied to evaluate the quality consistency. The results suggest useful applications for the combination of multivariate statistical analysis with chromatographic fingerprinting in batch-to-batch quality consistency evaluation for the manufacture of botanical drug products.
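The PCA-based Hotelling T² monitoring described here can be sketched with synthetic data; the random fingerprint matrix, the number of components, and the batch dimensions are all illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical fingerprint matrix: 20 historical batches x 6 peak areas
rng = np.random.default_rng(0)
X = rng.normal(loc=100.0, scale=5.0, size=(20, 6))

# PCA on mean-centered data via SVD; keep k principal components
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 2
scores = (X - mu) @ Vt[:k].T          # batch scores in PC space
pc_var = s[:k] ** 2 / (len(X) - 1)    # variance captured by each PC

# Hotelling T^2 per historical batch: squared scores scaled by PC variance
t2 = (scores ** 2 / pc_var).sum(axis=1)

# Project a new production batch into the same model and compute its T^2;
# a batch whose T^2 exceeds a control limit (e.g. an F-distribution-based
# limit or a historical percentile) is flagged as inconsistent.
new_batch = rng.normal(loc=100.0, scale=5.0, size=6)
t2_new = float((((new_batch - mu) @ Vt[:k].T) ** 2 / pc_var).sum())
print(round(t2_new, 2))
```

DModX monitoring would complement this by charting the residual distance of each batch from the PCA model plane.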

  1. Shipping/Receiving and Quality Control

    Data.gov (United States)

    Federal Laboratory Consortium — Shipping receiving, quality control, large and precise inspection and CMM machines. Coordinate Measuring Machines, including "scanning" probes, optical comparators,...

  2. Quality control of static irradiation processing products

    International Nuclear Information System (INIS)

    Bao Jianzhong; Chen Xiulan; Cao Hong; Zhai Jianqing

    2002-01-01

    Based on the irradiation processing practice of the nuclear technique application laboratory of Yangzhou Institute of Agricultural Science, the quality control of irradiation processing products is discussed

  3. Establishment for quality control of experimental animal

    International Nuclear Information System (INIS)

    Kim, Tae Hwan; Kim, Soo Kwan; Kim, Tae Kyoung

    1999-06-01

    Until now we have imported experimental animals from foreign laboratory animal suppliers; establishing quality control of animals in a barrier system allows this cost to be saved. In order to improve the quality of animal experiments and the efficiency of biomedical studies, it is indispensable to control the many factors that affect an experiment. It is therefore essential to organize a system of laboratory animal care that enhances the reliability and reproducibility of experimental results. The purpose of the present investigation was to establish a quality control system for experimental animals so that we can provide good-quality animals according to the experimental conditions of each investigator, although an exact quality control system for easily estimating bacterial and viral infection remains ill-defined. Accordingly, we established a quality control system for microbiological and environmental monitoring to protect experimental animals from harmful bacteria and viruses

  4. Establishment for quality control of experimental animal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Hwan; Kim, Soo Kwan; Kim, Tae Kyoung

    1999-06-01

    Until now we have imported experimental animals from foreign laboratory animal suppliers; establishing quality control of animals in a barrier system allows this cost to be saved. In order to improve the quality of animal experiments and the efficiency of biomedical studies, it is indispensable to control the many factors that affect an experiment. It is therefore essential to organize a system of laboratory animal care that enhances the reliability and reproducibility of experimental results. The purpose of the present investigation was to establish a quality control system for experimental animals so that we can provide good-quality animals according to the experimental conditions of each investigator, although an exact quality control system for easily estimating bacterial and viral infection remains ill-defined. Accordingly, we established a quality control system for microbiological and environmental monitoring to protect experimental animals from harmful bacteria and viruses.

  5. Product Quality Control in the Garment Industry Using Statistical Process Control (SPC)

    Directory of Open Access Journals (Sweden)

    Rizal Rachman

    2017-09-01

    Full Text Available The company regards quality as a key factor in its success, with quality standards set by the buyer. The purpose of this study was to determine the level of product defects within the quality control limits of the garment production process at PT. Asia Penta Garment. The study uses the statistical process control (SPC) method. The data are secondary data: reports of production volumes and garment defects in the finishing section in January 2017. The results show defects beyond the control limits, i.e., points out of control with respect to the upper control limit (UCL) and lower control limit (LCL), and an average defect rate outside the control limits. To improve product quality, particularly of the clothing the company produces, the established quality policy must be implemented properly, including negotiating raw materials with buyers according to standard, recruiting experienced workers, maintaining high work discipline, coaching employees, giving bonuses to employees who meet targets with high discipline, continuously maintaining machines, and keeping the working environment clean, comfortable and safe. Keywords: quality control, product quality, SPC.

  6. The Profile of Creativity and Proposing Statistical Problem Quality Level Reviewed From Cognitive Style

    Science.gov (United States)

    Awi; Ahmar, A. S.; Rahman, A.; Minggi, I.; Mulbar, U.; Asdar; Ruslan; Upu, H.; Alimuddin; Hamda; Rosidah; Sutamrin; Tiro, M. A.; Rusli

    2018-01-01

    This research aims to reveal the profile of the level of creativity and the ability to pose statistical problems among students of the 2014 Mathematics Education cohort at the State University of Makassar, in terms of their cognitive style. The research uses an explorative qualitative method, providing metacognitive scaffolding during the research. The research hypothesis is that students with a field-independent (FI) cognitive style, when posing statistical problems from the provided information, are already able to pose solvable problems that introduce new data, and their problems qualify as high-quality statistical problems, while students with a field-dependent (FD) cognitive style are still limited to posing solvable problems that do not introduce new data, and their problems qualify as medium-quality statistical problems.

  7. Quality control of mammographic systems

    International Nuclear Information System (INIS)

    Espana Lopez, M. L.

    2001-01-01

    High quality in mammography is a difficult objective to achieve, which is why efforts are made to improve equipment, to offer good screen-film combinations, and to provide professional staff dedicated to this technique. [es

  8. Employee quality, monitoring environment and internal control

    Directory of Open Access Journals (Sweden)

    Chunli Liu

    2017-03-01

    Full Text Available We investigate the effect of internal control employees (ICEs) on internal control quality. Using special survey data from Chinese listed firms, we find that ICE quality has a significant positive influence on internal control quality. We examine the effect of monitoring on this result and find that the effect is more pronounced for firms with strict monitoring environments, especially when the firms implement the Chinese internal control regulation system (CSOX), have higher institutional ownership or attach greater importance to internal control. Our findings suggest that ICEs play an important role in the design and implementation of internal control systems. Our study should be of interest to both top managers who wish to improve corporate internal control quality and regulators who wish to understand the mechanisms of internal control monitoring.

  9. Quality control education in the community college

    Science.gov (United States)

    Greene, J. Griffen; Wilson, Steve

    1966-01-01

    This paper describes the Quality Control Program at Daytona Beach Junior College, including course descriptions. The program in quality control required communication between the college and the American Society for Quality Control (ASQC). The college has established machinery for certifying the learning process, and the society is the source both of teachers who are competent in the technical field and of the employers of the educational products. The associate degree in quality control does not have a fixed program that can serve all needs, any more than all engineering degrees have identical programs. The main ideas common to all quality control programs are the concept of economic control of a repetitive process and the concept of developing individual potentialities into individuals who are needed and productive.

  10. Combining Statistical Methodologies in Water Quality Monitoring in a Hydrological Basin - Space and Time Approaches

    OpenAIRE

    Costa, Marco; A. Manuela Gonçalves

    2012-01-01

    This work discusses statistical approaches that combine multivariate statistical techniques and time series analysis to describe and model spatial patterns and the temporal evolution of hydrological series of water quality variables recorded in time and space. The approaches are illustrated with a data set collected in the River Ave hydrological basin, located in the Northwest region of Portugal.

  11. Industrial statistics and its recent contributions to total quality in the Netherlands

    NARCIS (Netherlands)

    Does, R.J.M.M.; Roes, K.C.B.

    1996-01-01

    The use of statistical methods in quality management has a long history. Most of the pioneers, such as Walter A. Shewhart and W. Edwards Deming, refer to themselves as statisticians. Statistical thinking in industry means that all work is a series of interconnected processes, that all processes show

  12. Temporal aspects of surface water quality variation using robust statistical tools.

    Science.gov (United States)

    Mustapha, Adamu; Aris, Ahmad Zaharin; Ramli, Mohammad Firuz; Juahir, Hafizan

    2012-01-01

    Robust statistical tools were applied to the water quality datasets with the aim of determining the most significant parameters and their contribution to temporal water quality variation. Surface water samples were collected from four different sampling points during the dry and wet seasons and analyzed for their physicochemical constituents. Discriminant analysis (DA) provided better results with great discriminatory ability, using five parameters (P < 0.05) for the dry season and affording more than 96% correct assignation, and using five and six parameters for forward and backward stepwise modes on the wet-season data (P < 0.05), affording 68.20% and 82% correct assignation, respectively. Partial correlation results revealed strong (r(p) = 0.829) and moderate (r(p) = 0.614) relationships between five-day biochemical oxygen demand (BOD(5)) and chemical oxygen demand (COD), and between total solids (TS) and dissolved solids (DS), controlling for the linear effect of nitrogen in the form of ammonia (NH(3)) and of conductivity for the dry and wet seasons, respectively. Multiple linear regression identified the contribution of each variable, with significant values r = 0.988, R(2) = 0.976 and r = 0.970, R(2) = 0.942 (P < 0.05) for the dry and wet seasons, respectively. A repeated-measures t-test confirmed that surface water quality varies significantly between the seasons (P < 0.05).
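
    The partial-correlation computation reported above (e.g. BOD5 vs. COD while controlling for NH3) can be illustrated with the standard residual method. The sketch below uses synthetic stand-in data, not the authors' dataset or code:

    ```python
    import numpy as np

    def partial_corr(x, y, controls):
        """Partial correlation of x and y, controlling for the linear
        effect of the control variables (residual method)."""
        x = np.asarray(x, float)
        y = np.asarray(y, float)
        Z = np.column_stack([np.ones(len(x))] + [np.asarray(c, float) for c in controls])
        # Residualize x and y on the controls, then correlate the residuals.
        rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
        ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
        return float(np.corrcoef(rx, ry)[0, 1])

    # Synthetic stand-ins: think of x as BOD5, y as COD, z as NH3 (hypothetical data).
    z = np.arange(20.0)
    x = np.sin(z)
    y = 2.0 * x + 3.0 * z          # y depends on x and on the control variable
    r_p = partial_corr(x, y, [z])  # close to 1.0 once the effect of z is removed
    ```

    Because y here is exactly 2x plus a multiple of the control variable, residualizing both series on z leaves perfectly proportional residuals, which is why the partial correlation approaches 1 even though the raw series are dominated by the trend in z.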

  13. Effect of radiation dose and adaptive statistical iterative reconstruction on image quality of pulmonary computed tomography

    International Nuclear Information System (INIS)

    Sato, Jiro; Akahane, Masaaki; Inano, Sachiko; Terasaki, Mariko; Akai, Hiroyuki; Katsura, Masaki; Matsuda, Izuru; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    The purpose of this study was to assess the effects of dose and adaptive statistical iterative reconstruction (ASIR) on image quality of pulmonary computed tomography (CT). Inflated and fixed porcine lungs were scanned with a 64-slice CT system at 10, 20, 40 and 400 mAs. Using automatic exposure control, 40 mAs was chosen as standard dose. Scan data were reconstructed with filtered back projection (FBP) and ASIR. Image pairs were obtained by factorial combination of images at a selected level. Using a 21-point scale, three experienced radiologists independently rated differences in quality between adjacently displayed paired images for image noise, image sharpness and conspicuity of tiny nodules. A subjective quality score (SQS) for each image was computed based on Anderson's functional measurement theory. The standard deviation was recorded as a quantitative noise measurement. At all doses examined, SQSs improved with ASIR for all evaluation items. No significant differences were noted between the SQSs for 40%-ASIR images obtained at 20 mAs and those for FBP images at 40 mAs. Compared to the FBP algorithm, ASIR for lung CT can enable an approximately 50% dose reduction from the standard dose while preserving visualization of small structures. (author)

  14. Related regulation of quality control of industrial products

    International Nuclear Information System (INIS)

    1983-04-01

    This book introduces the regulations governing quality control of industrial products, including the industrial product quality control regulations, their enforcement ordinance and enforcement rules, the items designated for industrial product quality marking, industrial product quality testing, quality test organizations, and the management of factory quality by grade.

  15. Quality control of nuclear medicine instrumentation

    International Nuclear Information System (INIS)

    Mould, R.F.

    1983-09-01

    The proceedings of a conference held by the Hospital Physicists' Association in London 1983 on the quality control of nuclear medicine instrumentation are presented. Section I deals with the performance of the Anger gamma camera including assessment during manufacture, acceptance testing, routine testing and long-term assessment of results. Section II covers interfaces, computers, the quality control problems of emission tomography and the quality of software. Section III deals with radionuclide measurement and impurity assessment and Section IV the presentation of images and the control of image quality. (U.K.)

  16. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center ({+-}4% of deviation between the calculated and measured doses) by calculating a control process capability (C{sub pc}) index. The C{sub pc} index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  17. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. 
Therefore, when analyzed in real time, during quality controls, they should improve the
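
    The individuals, moving-range, and EWMA charts named in this record can be sketched in a few lines. The following is an illustrative reconstruction, not the authors' implementation; the dose-deviation values are invented, and the usual control-chart constant d2 = 1.128 (moving ranges of size two) is assumed.

    ```python
    import numpy as np

    def individuals_limits(x):
        """3-sigma limits for an individuals (I) chart, with sigma estimated
        from the average moving range (d2 = 1.128 for subgroups of two)."""
        x = np.asarray(x, float)
        mr = np.abs(np.diff(x))            # moving ranges of consecutive QC results
        sigma_hat = mr.mean() / 1.128
        center = x.mean()
        return center - 3 * sigma_hat, center, center + 3 * sigma_hat

    def ewma(x, lam=0.2):
        """EWMA statistic z_i = lam*x_i + (1-lam)*z_{i-1}; a small lam makes
        the chart sensitive to slow drifts that single points do not reveal."""
        x = np.asarray(x, float)
        z = np.empty_like(x)
        z[0] = x[0]
        for i in range(1, len(x)):
            z[i] = lam * x[i] + (1 - lam) * z[i - 1]
        return z

    # Hypothetical % deviations between measured and calculated dose.
    qc = [0.5, -1.0, 0.8, 1.2, -0.3, 0.9, 1.5, 2.0, 2.4, 2.9]
    lcl, center, ucl = individuals_limits(qc)
    drift = ewma(qc)  # the rising tail flags a drift before any point crosses the UCL
    ```

    On this toy series no single point exceeds the individuals-chart UCL, yet the EWMA statistic climbs steadily, which is the kind of early drift detection the record describes.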

  18. Developing methods of controlling quality costs

    OpenAIRE

    Gorbunova A. V.; Maximova O. N.; Ekova V. A.

    2017-01-01

    The article examines issues of managing quality costs, problems of applying economic methods of quality control, implementation of progressive methods of quality costs management in enterprises with the view of improving the efficiency of their evaluation and analysis. With the aim of increasing the effectiveness of the cost management mechanism, authors introduce controlling as a tool of deviation analysis from the standpoint of the process approach. A list of processes and corresponding eva...

  19. Statistically Controlling for Confounding Constructs Is Harder than You Think.

    Directory of Open Access Journals (Sweden)

    Jacob Westfall

    Full Text Available Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest--in some cases approaching 100%--when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity.
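
    The core phenomenon, spurious incremental validity when the confounding construct is measured with error, is easy to reproduce. The simulation below is a minimal sketch under assumed parameters (one latent construct, two noisy indicators with reliability 0.6), not the authors' code:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def type1_rate(n=300, reliability=0.6, n_sims=500):
        """Fraction of simulations in which x2 looks 'incrementally valid'
        (|t| > 1.96) even though only the latent construct T drives y."""
        noise_sd = np.sqrt(1.0 / reliability - 1.0)
        hits = 0
        for _ in range(n_sims):
            t = rng.standard_normal(n)                    # latent confounding construct
            y = t + rng.standard_normal(n)                # outcome depends only on T
            x1 = t + noise_sd * rng.standard_normal(n)    # unreliable measure of T
            x2 = t + noise_sd * rng.standard_normal(n)    # a second unreliable measure
            X = np.column_stack([np.ones(n), x1, x2])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            s2 = resid @ resid / (n - 3)                  # residual variance
            se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[2, 2])
            hits += abs(beta[2] / se) > 1.96              # t-test on the x2 coefficient
        return hits / n_sims

    rate = type1_rate()  # far above the nominal 5% despite 'controlling' for x1
    ```

    Because x1 only partially captures T, x2 still carries information about T after x1 is partialled out, so the "incremental" coefficient is significant in most simulated samples.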

  20. Statistical analysis of longitudinal quality of life data with missing measurements

    NARCIS (Netherlands)

    Zwinderman, A. H.

    1992-01-01

    The statistical analysis of longitudinal quality of life data in the presence of missing data is discussed. In cancer trials missing data are generated due to the fact that patients die, drop out, or are censored. These missing data are problematic in the monitoring of the quality of life during the

  1. Application of statistical process control to qualitative molecular diagnostic assays

    LENUS (Irish Health Repository)

    O'Brien, Cathal P.

    2014-11-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.

  2. Application of statistical process control to qualitative molecular diagnostic assays.

    Science.gov (United States)

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
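
    The approach described, a frequency estimate coupled with a confidence interval, can be sketched as follows. This is an illustrative reconstruction, not the published method's code; it assumes a Wilson score interval (one common choice for proportions) and uses hypothetical counts.

    ```python
    import math

    def wilson_interval(k, n, z=1.96):
        """Approximate 95% Wilson score interval for an observed proportion k/n."""
        p = k / n
        denom = 1.0 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return center - half, center + half

    def frequency_in_control(k, n, expected):
        """In control if the expected mutation frequency lies inside the
        confidence interval of the observed frequency k/n."""
        lo, hi = wilson_interval(k, n)
        return lo <= expected <= hi

    # Hypothetical run: 30 mutation-positive results in 100 assays,
    # against an expected population frequency of 40%.
    flag = frequency_in_control(30, 100, 0.40)  # False: investigate the assay
    ```

    As the record notes, assays with low mutation frequencies need larger n: the interval narrows only as the square root of the sample number, so small deviations from the expected frequency take many samples to detect.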

  3. Fuel manufacture and quality control

    International Nuclear Information System (INIS)

    Roepenack, H.; Raab, K.

    1975-01-01

    The different steps in fuel and fuel element manufacturing, from the conversion of UF6 to UO2 to the assembly of the complete fuel element, are briefly described. Each of these fabrication steps must satisfy well-defined quality criteria, which are checked by specific analyses or tests. (RB) [de

  4. Quality Control in Mammography: Image Quality and Patient Doses

    International Nuclear Information System (INIS)

    Ciraj Bjelac, O.; Arandjic, D.; Boris Loncar, B.; Kosutic, D.

    2008-01-01

    Mammography is the method of choice for early detection of breast cancer. The purpose of this paper is a preliminary evaluation of mammography practice in Serbia in terms of quality control indicators, i.e., image quality and patient doses. The survey demonstrated considerable variations in the technical parameters that affect image quality and patient doses. Mean glandular doses ranged from 0.12 to 2.8 mGy, while reference optical density ranged from 1.2 to 2.8. A correlation between image contrast and mean glandular dose was demonstrated. Systematic implementation of a quality control protocol should ensure satisfactory performance of mammography units, maintain satisfactory image quality, and keep patient doses as low as reasonably practicable. (author)

  5. Quality control in the radioactive waste management

    International Nuclear Information System (INIS)

    Rzyski, B.M.

    1989-01-01

    Radioactive waste management, like other industrial activities, must maintain a quality control programme at every step. This control, extending from the acquisition of materials for waste treatment to the disposal of packages, is one of the most important activities because it ensures observance of the waste acceptance criteria of repositories and helps guarantee the safety of nuclear facilities. This work presents basic knowledge about quality control in waste management and some examples of procedures adopted in other countries. (author) [pt

  6. Internal quality control of neutron activation analysis laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. H.; Mun, J. H.; BaeK, S. Y.; Jung, Y. S.; Kim, Y. J. [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    The importance of quality assurance and control in analytical laboratories is increasingly emphasized. Internal quality control using certified reference materials (CRMs) is one effective method for this purpose. In this study, 10 kinds of CRMs comprising soil, sediment and biological matrices were analyzed. To evaluate the confidence of the analytical results and validate the testing method and procedure, the accuracy and precision of the measured elements were treated statistically, and the reproducibility was compared with values produced before 2003.
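
    Accuracy and precision checks against a CRM are commonly summarized with a z-score and a relative standard deviation. The sketch below uses those generic formulas with invented numbers; it is not this laboratory's actual procedure:

    ```python
    import statistics

    def z_score(measured, certified, u_certified):
        """Accuracy check against a CRM: |z| <= 2 is usually judged
        satisfactory, 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
        return (measured - certified) / u_certified

    def rsd_percent(replicates):
        """Precision check: relative standard deviation of replicate results."""
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    # Hypothetical CRM: certified value 10.0 +/- 0.5 mg/kg; four replicates.
    replicates = [9.8, 10.1, 10.4, 9.9]
    z = z_score(statistics.mean(replicates), 10.0, 0.5)  # 0.1 -> satisfactory
    rsd = rsd_percent(replicates)                        # roughly 2.6 %
    ```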

  7. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used to evaluate analytical chemistry measurement quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision), to evaluate the measurement system, and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort

  8. EFFECT OF QUALITY CONTROL SYSTEM ON AUDIT QUALITY WITH PROFESSIONAL COMMITMENTS AS A MODERATION VARIABLE

    Directory of Open Access Journals (Sweden)

    Ramadhani R.

    2017-12-01

    Full Text Available This study aims to test the effect of each element of the Quality Control System (QCS), namely leadership responsibility for quality in audits, relevant ethical requirements, acceptance and continuance of client relationships and specific engagements, assignment of engagement teams, engagement performance, monitoring, and documentation, on audit quality, and to test whether professional commitment moderates the effect of each QCS element on audit quality. The population was staff auditors working in public accounting firms domiciled in Jakarta, especially the Central Jakarta area, with a sample of 84 respondents. The statistical method used was SEM-PLS with the SmartPLS application. The results indicate that of the seven QCS elements, only the relevant ethical requirements affect audit quality. The study also found that professional commitment does not moderate the relationship between the seven QCS elements and audit quality.

  9. TRAINING SYSTEM OF FUTURE SPECIALISTS: QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    Vladimir A. Romanov

    2015-01-01

    Full Text Available The aim of the investigation is the development of an innovative strategy for quality control of the training of engineers and skilled workers (hereinafter, future specialists) in professional educational organizations on the principles of social partnership. Methods. Theoretical: theoretical and methodological analysis, polytheoretic synthesis, modeling. Empirical: research and generalization of experience with the systems, process and competence-based approaches; experiment; observation; surveys; expert evaluation; SWOT analysis as a strategic planning method for identifying the internal and external (socio-cultural) factors of the organization's surroundings. Results. A strategy for developing quality control of training in professional educational organizations, and a predictive model of the quality control system for training future engineers and workers, have been created on the basis of analysis and synthesis of a quantitative specification of quality and of the experience and success achieved in controlling the training of future specialists in professional educational organizations under recent economic and educational conditions. Scientific novelty. A predictive model of quality control of the training of future specialists has been built to meet modern standards and the principles of social partnership, together with a control algorithm for the learning process developed in accordance with the international ISO quality standards for implementing a process approach in quality control systems (a matrix of responsibility and remit of those responsible for the educational process in the organization, the 'problem' terms, and diagnostic tools for assessing the quality of professional training of future specialists). The perspective directions of innovation in the control of the quality of future professionals' training have been determined; the parameters of a comprehensive analysis of the state of the system to ensure the

  10. Pattern-based feature extraction for fault detection in quality relevant process control

    NARCIS (Netherlands)

    Peruzzo, S.; Holenderski, M.J.; Lukkien, J.J.

    2017-01-01

    Statistical quality control (SQC) applies multivariate statistics to monitor production processes over time and detect changes in their performance in terms of meeting specification limits on key product quality metrics. These limits are imposed by customers and typically assumed to be a single

  11. CONCRETE STRUCTURES' QUALITY CONTROL IN PRACTICE

    OpenAIRE

    Dolaček-Alduk, Zlata; Blanda, Miroslav

    2011-01-01

    Croatian civil engineering is characterized by the lack of a systematic approach to planning, control and quality assurance in all phases of project realization. The results obtained in establishing quality management systems in some segments of civil engineering production represent initial trends in solving this problem. The benefits are of two types: achieving quality, for the contractor, and assurance that quality is being achieved, for clients. Execution of concrete structures is a c...

  12. Application of statistical classification methods for predicting the acceptability of well-water quality

    Science.gov (United States)

    Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.

    2018-01-01

    The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
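
    The abstract does not name the winning algorithm; as a generic illustration of statistical classification for an exceedance problem of this kind, here is a minimal logistic-regression sketch in plain NumPy. The predictors, data, and decision boundary are all synthetic and hypothetical:

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, steps=3000):
        """Logistic regression by batch gradient ascent on the log-likelihood."""
        Xb = np.column_stack([np.ones(len(X)), X])
        w = np.zeros(Xb.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
            w += lr * Xb.T @ (y - p) / len(y)
        return w

    def predict_exceeds(w, X):
        """True where the predicted probability of exceeding the limit is >= 0.5."""
        Xb = np.column_stack([np.ones(len(X)), X])
        return 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30))) >= 0.5

    # Synthetic training set: two standardized predictors (imagine well depth and
    # distance from the saline zone, both hypothetical) and an exceedance label.
    rng = np.random.default_rng(42)
    X = rng.standard_normal((200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)  # toy decision boundary

    w = fit_logistic(X, y)
    accuracy = float((predict_exceeds(w, X) == (y == 1)).mean())
    ```

    The point of a classifier here, as in the study, is that it predicts the binary outcome (fit vs. unfit water) directly from covariates, without requiring an explicit hydrogeological model of the chloride concentration field.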

  13. A quality control manual for oral radiology

    International Nuclear Information System (INIS)

    Peixoto, J.E.; Ferreira, R.S.; Bessa, S.O.; Domingues, C.; Gomes, C.A.; Oliveira, S.L.G.; Ortiz, J.A.P.

    1988-01-01

    A quality control manual for oral radiology is presented. The X-ray equipment used in this activity is described, including the X-ray tube and collimator. The high voltage of the X-ray tube, the spectra, the quality and quantity of radiation, and the X-ray intensity are also analysed. (C.G.C.) [pt

  14. The regulatory maze of quality control

    International Nuclear Information System (INIS)

    Stone, T.I.

    1987-01-01

    The appropriateness of specific procedures within a quality control program becomes difficult to assess when an attempt is made to collate all of the available information. This task is discussed from the perspectives of the Joint Commission (JCAH Accreditation Manual), HHS (quality assurance program recommendations), equipment manufacturers' maintenance schedules, and radiology administrative cost concerns.

  15. Quality Control Guidelines for SAM Biotoxin Methods

    Science.gov (United States)

    Learn more about quality control guidelines and recommendations for the analysis of samples using the biotoxin methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  16. Quality Control Guidelines for SAM Radiochemical Methods

    Science.gov (United States)

    Learn more about quality control guidelines and recommendations for the analysis of samples using the radiochemistry methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  17. Quality Control Guidelines for SAM Pathogen Methods

    Science.gov (United States)

    Learn more about quality control guidelines and recommendations for the analysis of samples using the pathogen methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  18. Quality Control Guidelines for SAM Chemical Methods

    Science.gov (United States)

    Learn more about quality control guidelines and recommendations for the analysis of samples using the chemistry methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  19. Metallographic quality control of welding and brazing

    International Nuclear Information System (INIS)

    Slaughter, G.M.

    1979-01-01

    The value of metallography in assuring the integrity of fabricated metal components in energy systems is summarized. Metallography also plays an integral role in the quality control of welded and brazed joints.

  20. Quality Controlled Local Climatological Data (QCLCD) Publication

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Quality Controlled Local Climatological Data (QCLCD) contains summaries from major airport weather stations that include a daily account of temperature extremes,...

  1. Quality control during construction of power plants

    International Nuclear Information System (INIS)

    Hartstern, R.F.

    1982-01-01

    This paper traces the background and examines the necessity for a program to control quality during the construction phase of a power plant. It also attempts to point out considerations for making these programs cost effective

  2. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
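
    The b-value of the Gutenberg-Richter relation discussed above is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - (Mc - dM/2)). A sketch of that standard estimator (not the authors' code; the toy magnitudes are illustrative):

```python
import math

def b_value(mags, mc, dm=0.1):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value for
    magnitudes at or above the completeness magnitude mc (bin width dm)."""
    m = [x for x in mags if x >= mc]
    mean_mag = sum(m) / len(m)
    return math.log10(math.e) / (mean_mag - (mc - dm / 2))

# Toy catalog: a lower mean magnitude above mc gives a higher b-value
# (relatively fewer large events), the quantity the study tracks against tidal stress.
b_quiet = b_value([1.0, 1.2, 1.1, 1.5, 1.0, 1.3], mc=1.0)
```
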

  3. Advanced methods of quality control in nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Onoufriev, Vladimir

    2004-01-01

    Under pressure of the current economic and electricity market situation, utilities implement more demanding fuel utilization schemes, including higher burnups and thermal rates, longer fuel cycles and the usage of Mo fuel. Therefore, fuel vendors have recently initiated new R and D programmes aimed at improving fuel quality, design and materials to produce robust and reliable fuel. At the beginning of commercial fuel fabrication, emphasis was given to advancements in Quality Control/Quality Assurance related mainly to the product itself. In recent years, emphasis has shifted to improvements in process control and to the implementation of overall Total Quality Management (TQM) programmes. In the area of fuel quality control, statistical control methods are now widely implemented, replacing 100% inspection. This evolution, some practical examples and IAEA activities are described in the paper. The paper presents major findings of the latest IAEA Technical Meetings (TMs) and training courses in the area, with emphasis on information received at the TM and training course held in 1999 and other recent publications, to provide an overview of new developments in process/quality control, their implementation and the results obtained, including new approaches to QC.
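
    The statistical control methods that replace 100% inspection are typically acceptance-sampling plans. As a sketch under assumed numbers (the plan and defect rate below are illustrative, not from the record), the operating characteristic of a single sampling plan (n, c) follows from the binomial distribution:

```python
from math import comb

def accept_prob(n, c, p):
    """Operating-characteristic value of a single sampling plan (n, c):
    probability of accepting a lot with true fraction defective p, i.e. of
    finding at most c defectives in a random sample of n items."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(c + 1))

# Hypothetical plan: sample 50 fuel pellets, accept the lot if at most 1 is defective.
p_accept = accept_prob(50, 1, 0.01)
```

    Plotting `accept_prob` against p gives the OC curve used to trade consumer's against producer's risk when choosing n and c.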

  4. Guideline implementation in clinical practice: Use of statistical process control charts as visual feedback devices

    Directory of Open Access Journals (Sweden)

    Fahad A Al-Hussein

    2009-01-01

    Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
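
    One concrete form of such visual feedback is a Shewhart individuals (XmR) chart, whose limits come from the average moving range via the standard 2.66 constant. The weekly audit scores below are hypothetical:

```python
import statistics

def xmr_limits(values):
    """Centre line and control limits of a Shewhart individuals (XmR) chart:
    centre +/- 2.66 * average moving range (standard chart constant)."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    centre = statistics.mean(values)
    spread = 2.66 * statistics.mean(moving_ranges)
    return centre - spread, centre, centre + spread

# Hypothetical weekly audit compliance scores for one guideline indicator.
scores = [78, 81, 79, 83, 80, 77, 82, 79, 81, 80]
lcl, centre, ucl = xmr_limits(scores)
signals = [s for s in scores if s < lcl or s > ucl]  # special-cause points
```

    Points outside the limits signal special-cause variation and prompt an audit of that period's practice.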

  5. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  6. Developing methods of controlling quality costs

    Directory of Open Access Journals (Sweden)

    Gorbunova A. V.

    2017-01-01

    Full Text Available The article examines issues of managing quality costs, problems of applying economic methods of quality control, and the implementation of progressive methods of quality-cost management in enterprises, with a view to improving the efficiency of their evaluation and analysis. To increase the effectiveness of the cost-management mechanism, the authors introduce controlling as a tool for deviation analysis from the standpoint of the process approach. A list of processes and corresponding evaluation criteria in the quality management system of an enterprise is introduced. The authors also present a method of controlling quality costs and propose it for practical application, allowing useful and unnecessary costs to be distinguished at an existing operating plant. Implementing the proposed recommendations in an enterprise's cost-management system will improve the productivity of its processes and reduce wasted expenditure on quality, on the basis of determining the useful and useless quality costs according to the criteria by which processes function in the quality management system.

  7. Quality assurance and quality control in mammography: A review

    International Nuclear Information System (INIS)

    BenComo, Jose A.

    2000-01-01

    A mammogram is among the most technically demanding radiographic procedures. The early detection of breast cancer relies on the radiologist's ability to perceive subtle changes in the image that are only perceptible with high-quality imaging. Early detection of breast cancer is only as reliable as the mammogram with which a diagnosis is made, and a mammogram is only as accurate as the system that produces it. A quality assurance (QA) program maximizes the likelihood that the mammographic images will provide adequate diagnostic information for the least possible radiation exposure and cost to the patient. The QA program monitors each phase of operation of the imaging facility beginning with the request for an examination and ending with the interpretation of the referring physician and ensures that the imaging equipment used for the examination will yield the information desired. Because image quality is the most important technical aspect of mammography, this review summarizes the most important QA and quality control issues

  8. An integrated model of statistical process control and maintenance based on the delayed monitoring

    International Nuclear Information System (INIS)

    Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei

    2015-01-01

    This paper develops an integrated model of statistical process control and maintenance decision-making. The proposed delayed monitoring policy postpones the sampling process until a scheduled time and yields ten scenarios of the production process, in which equipment failure may occur in addition to a quality shift. The equipment failure and the control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, cycle time and cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to verify the analysis of the ten-scenario model. An ordinary integrated model without delayed monitoring is also established for comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy helps reduce the expected cost.
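
    The Monte Carlo idea can be illustrated with a deliberately reduced toy version: a cycle ends at whichever event comes first, a quality shift caught by the chart (predictive maintenance) or an equipment failure (corrective maintenance), modelled as exponential competing risks. All rates and costs below are made up; this is not the paper's ten-scenario model.

```python
import random

def expected_cycle_cost(rate_shift, rate_fail, cost_pm, cost_cm,
                        n_sim=10000, seed=3):
    """Toy Monte Carlo estimate of expected cost per cycle: a quality shift
    triggers predictive maintenance (cost_pm), an earlier equipment failure
    triggers corrective maintenance (cost_cm). Exponential competing risks."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        t_shift = rng.expovariate(rate_shift)
        t_fail = rng.expovariate(rate_fail) if rate_fail > 0 else float("inf")
        total += cost_pm if t_shift <= t_fail else cost_cm
    return total / n_sim
```
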

  9. Quality control and analysis of radiotracer compounds

    International Nuclear Information System (INIS)

    Sheppard, G.; Thomson, R.

    1977-01-01

    Special emphasis is on the problems and errors possible in quality control and analysis. The principles underlying quality control are outlined, and analytical techniques applicable to radiotracers are described. The chapter concludes with a selection of examples showing the effects of impurities on the use of radiotracers. The subject of quality control and analysis is treated from the viewpoint of the user and of those research workers who need to synthesize and analyze their own radiochemicals. The quality characteristics for radiotracers are of two kinds, variables or attributes; these are discussed in the chapter. For counting low radioactive concentrations, scintillation techniques are in general use, whereas ionization techniques are now used mainly for the measurement of high radioactive concentrations or large quantities of radioactivity, for scanning chromatograms, and for a number of very specific purposes. Determination of radionuclidic purity is discussed, and the use of radiotracers in pharmaceuticals is presented. 4 figures, 6 tables

  10. Network-based production quality control

    Science.gov (United States)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. The recent emphasis on product quality and reduction of waste stems from a dynamic, globalized and customer-driven market, which brings opportunities and threats to companies, depending on their response speed and production strategies. Current trends in industry also include the wide spread of distributed manufacturing systems, in which design, production and management facilities are geographically dispersed. This situation mandates not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete in such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing", or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such a system lets designers located far away from the production facility monitor, control and adjust the quality inspection processes as the production design evolves.

  11. 20. Quality assurance and quality control in nuclear medicine

    International Nuclear Information System (INIS)

    Vavrejn, B.

    1989-01-01

    Quality control principles to be applied when taking over and using nuclear medicine instrumentation are given. Such instrumentation includes activity meters, gamma detectors for in vitro measurements (manual or automated instruments), gamma detectors for in vivo measurements (with one or several probes), mobile scintigraphs and stationary scintigraphs (gamma cameras). (Z.S.)

  12. Bootstrap-based confidence estimation in PCA and multivariate statistical process control

    DEFF Research Database (Denmark)

    Babamoradi, Hamid

    Traditional/asymptotic confidence estimation has limited applicability, since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters; furthermore, in case the theories are available for a specific indicator/parameter, the theories are based… The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case-study format (Paper I), and the main steps in the algorithm were discussed, where a set of sensible choices (plus… Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in the case of the Q-statistic… It was also noted that outliers in the data should be detected, since they can distort the bootstrap estimates.
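
    The bootstrap alternative to asymptotic control limits can be sketched minimally: resample the in-control values of a monitoring statistic (e.g. Q/SPE values from PCA residuals), take the empirical 95th percentile of each resample, and average. This is only the idea in miniature; the thesis resamples the underlying data matrix and recomputes the statistic, and the in-control values below are invented.

```python
import random
import statistics

def bootstrap_ucl(q_values, n_boot=500, seed=7):
    """Bootstrap-based upper control limit for a monitoring statistic such as
    the PCA Q-statistic: resample with replacement, take the ~95th percentile
    of each resample, and average the percentiles over resamples."""
    rng = random.Random(seed)
    n = len(q_values)
    percentiles = []
    for _ in range(n_boot):
        resample = [rng.choice(q_values) for _ in range(n)]
        percentiles.append(statistics.quantiles(resample, n=20)[-1])  # ~95th pct.
    return statistics.mean(percentiles)

# Hypothetical in-control Q-statistic values.
ucl = bootstrap_ucl([float(i) for i in range(1, 101)])
```
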

  13. Quality control in 99m technetium radiopharmaceuticals

    International Nuclear Information System (INIS)

    Leon Cabana, Alba

    1994-01-01

    This work concerns quality control in the preparation of 99mTc radiopharmaceuticals at the hospital level. Several steps must be followed in a nuclear medicine laboratory, covering procedures, the preparation of radiopharmaceutical kits, and dispensing: materials, glassware, stoppers, physical aspects, identification, pH control, storage, and reagent kits

  14. Exploring the use of statistical process control methods to assess course changes

    Science.gov (United States)

    Vollstedt, Ann-Marie

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time-consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g. inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses, even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, quality was defined in this study as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshman classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, it was felt that individual course grades needed adjustment to accommodate the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control
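
    Chauvenet's criterion, used above to detect and remove outlying data, rejects a point when the expected number of equally extreme samples (assuming a normal distribution) falls below one half. A standard stdlib sketch with hypothetical quiz scores:

```python
import math
import statistics

def chauvenet_filter(data):
    """Chauvenet's criterion: reject a point if the expected number of samples
    at least as far from the mean (two-tailed normal) is below 0.5."""
    n = len(data)
    mu = statistics.mean(data)
    sd = statistics.stdev(data)
    def expected_count(x):
        z = abs(x - mu) / sd
        phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF at z
        return n * 2.0 * (1.0 - phi)
    return [x for x in data if expected_count(x) >= 0.5]

# Hypothetical quiz scores with one gross outlier.
grades = [82, 85, 78, 88, 84, 80, 86, 79, 83, 20]
kept = chauvenet_filter(grades)
```
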

  15. PACS quality control and automatic problem notifier

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established

  16. Exposure parameters in fluoroscopy equipment. Quality control

    International Nuclear Information System (INIS)

    Alonso, M.; Castaneda, M.J.; Matorras, P.; Diaz-Caneja, N.; Gutierrez, I.

    1992-01-01

    Within the quality control program in Diagnostic Radiology currently being undertaken at the 'Marques de Valdecilla' University Hospital, the corresponding specification and procedure prototypes for the control of conventional radioscopy equipment have been elaborated and applied. This paper presents the values proposed in the specifications and those obtained for the following radioscopy equipment parameters: reference kerma and its reproducibility, kerma linearity, maximum kerma at the skin, and total filtration. The results obtained indicate that the equipment studied could comply with the specified requirements if a Maintenance Program were to be implemented in coordination with the Quality Control Program. (author)

  17. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Science.gov (United States)

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted on annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grades 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
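
    The P charts referred to above place three-sigma limits around an overall proportion. A minimal sketch with assumed numbers (the prevalence and subgroup size below are illustrative, not from the study):

```python
import math

def p_chart_limits(p_bar, n):
    """Three-sigma limits for a P chart monitoring a proportion (here, nosocomial
    pressure ulcer prevalence) with subgroup size n; the lower limit is floored
    at zero."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3.0 * sigma), p_bar + 3.0 * sigma

# Hypothetical: overall prevalence 0.12 with roughly 180 at-risk patients per survey.
lcl, ucl = p_chart_limits(0.12, 180)
within_limits = [lcl <= p <= ucl for p in (0.05, 0.10, 0.22)]
```

    Yearly rates inside the limits are consistent with common-cause variation, which is how a P chart can attribute an apparent "trend" to chance.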

  18. No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.

    Science.gov (United States)

    Li, Xuelong; Guo, Qun; Lu, Xiaoqiang

    2016-05-13

    It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly considered simultaneously. In this paper, we propose a new NR-VQA metric based on spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortions and robust across different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
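
    The 3D-DCT step can be illustrated with a brute-force orthonormal transform of a tiny video cube; the paper's NVS features (e.g. skewness- and kurtosis-like statistics) are then computed over the distributions of such coefficients across many blocks. This toy cube and transform size are illustrative only:

```python
import math

def dct3(block):
    """Brute-force orthonormal 3-D DCT-II of a small video cube block[t][y][x].
    Fine for illustration; real NR-VQA systems use fast transforms."""
    T, H, W = len(block), len(block[0]), len(block[0][0])
    def c(k, N):
        return math.sqrt((1 if k == 0 else 2) / N)
    out = [[[0.0] * W for _ in range(H)] for _ in range(T)]
    for u in range(T):
        for v in range(H):
            for w in range(W):
                s = sum(block[t][y][x]
                        * math.cos(math.pi * (t + 0.5) * u / T)
                        * math.cos(math.pi * (y + 0.5) * v / H)
                        * math.cos(math.pi * (x + 0.5) * w / W)
                        for t in range(T) for y in range(H) for x in range(W))
                out[u][v][w] = c(u, T) * c(v, H) * c(w, W) * s
    return out

# A flat 2x2x2 "video" cube: all energy should land in the DC coefficient.
cube = [[[1.0, 1.0], [1.0, 1.0]], [[1.0, 1.0], [1.0, 1.0]]]
coeffs = dct3(cube)
ac = [coeffs[u][v][w] for u in range(2) for v in range(2) for w in range(2)
      if (u, v, w) != (0, 0, 0)]
```
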

  19. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  20. Quality Control and Quality Assurance of Radiation Oncology

    International Nuclear Information System (INIS)

    Abaza, A.

    2016-01-01

    Radiotherapy (RT) has played important roles in cancer treatment for more than a century. The development of RT techniques allows high-dose irradiation of tumors while reducing the radiation doses delivered to surrounding normal tissues. However, RT is a complex process and involves understanding of the principles of medical physics, radiobiology, radiation safety, dosimetry, radiation treatment planning, simulation and the interaction of radiation with other treatment modalities. Each step in the integrated process of RT needs quality control and quality assurance (QA) to prevent errors and to ensure that patients will receive the prescribed treatment correctly. The aim of this study is to help radiotherapists identify a system for QA that balances patient safety and quality with available resources. Recent advances in RT underline the need for a systematic RT QA program that balances patient safety and quality with available resources. It is necessary to develop more formal error mitigation and process analysis methods, such as failure mode and effects analysis (FMEA), to focus available QA resources optimally on the process components. External audit programs are also effective. Additionally, clinical trial QA has a significant role in enhancing the quality of care. The International Atomic Energy Agency (IAEA) has operated both on-site and off-site (postal) dosimetry audits to improve practice and to assure the dose from RT equipment. Both postal dosimetry audits and clinical trial RTQA, especially for advanced technologies, in collaboration with global networks, will serve to enhance patient safety and quality of care.

  1. Application of multivariate statistical techniques in the water quality assessment of Danube river, Serbia

    Directory of Open Access Journals (Sweden)

    Voza Danijela

    2015-12-01

    Full Text Available The aim of this article is to evaluate the quality of the Danube River in its course through Serbia and to demonstrate the possibilities of using three statistical methods: Principal Component Analysis (PCA), Factor Analysis (FA) and Cluster Analysis (CA) in surface water quality management. Given that the Danube is an important trans-boundary river, thorough water quality monitoring by sampling at different distances over shorter and longer periods of time is not only an ecological but also a political issue. Monitoring was carried out at monthly intervals from January to December 2011 at 17 sampling sites. The obtained data set was treated with multivariate techniques in order, firstly, to identify similarities and differences between sampling periods and locations; secondly, to recognize the variables that affect temporal and spatial water quality changes; and thirdly, to present the anthropogenic impact on water quality parameters.
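
    The core of PCA, extracting the dominant direction of variation from correlated variables, can be sketched with power iteration on the sample covariance matrix. The two-variable data below are invented stand-ins for correlated water-quality measurements:

```python
import math

def first_pc(rows, iters=200):
    """First principal-component loading vector of a samples-by-variables matrix,
    computed by power iteration on the sample covariance matrix."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    X = [[r[j] - means[j] for j in range(p)] for r in rows]
    cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p  # arbitrary start vector
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Two strongly correlated hypothetical variables (e.g. conductivity and chloride).
samples = [[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.2], [5.0, 10.0]]
loading = first_pc(samples)
```

    Real applications standardize the variables first and extract several components by deflation; library routines do this directly.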

  2. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    Full Text Available A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions that characterize pathological voice quality. 83 voice samples of sustained vowel /a/ phonation are used in this study; they were independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale, and were used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented with a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
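
    The LPC-residual preprocessing can be sketched with a low-order toy version: fit a linear predictor by least squares, take the prediction-error (residual) sequence, and compute the normalized skewness and kurtosis used as HOS features. Order 2 and the synthetic signal are illustrative only; speech LPC uses much higher orders.

```python
import math

def lpc2_residual(x):
    """Order-2 linear-prediction residual: fit x[n] ~ a1*x[n-1] + a2*x[n-2]
    by least squares (normal equations via Cramer's rule), return the errors."""
    r = lambda i, j: sum(x[n - i] * x[n - j] for n in range(2, len(x)))
    a11, a12, a22 = r(1, 1), r(1, 2), r(2, 2)
    b1, b2 = r(0, 1), r(0, 2)
    det = a11 * a22 - a12 * a12
    a1 = (b1 * a22 - b2 * a12) / det
    a2 = (a11 * b2 - a12 * b1) / det
    return [x[n] - a1 * x[n - 1] - a2 * x[n - 2] for n in range(2, len(x))]

def norm_skew_kurt(e):
    """Normalized skewness and kurtosis (the HOS features) of a sequence."""
    n = len(e)
    mu = sum(e) / n
    m2 = sum((v - mu) ** 2 for v in e) / n
    m3 = sum((v - mu) ** 3 for v in e) / n
    m4 = sum((v - mu) ** 4 for v in e) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2
```

    A pure sinusoid satisfies an exact order-2 recurrence, so its residual is essentially zero, a quick sanity check on the fit.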

  3. Development of the gaharu oil quality control

    International Nuclear Information System (INIS)

    Chong Saw Peng; Mohd Fajri Osman; Shyful Azizi Abdul Rahman; Khairuddin Abdul Rahim; Mat Rasol Awang

    2010-01-01

    Gaharu (agarwood) is a secondary metabolite produced by Aquilaria spp. that accumulates in the plant cells in oleoresin form. The essential oil, known as gaharu oil, can be extracted from this oleoresin via various extraction methods such as water distillation, solvent extraction and pressurized extraction. Gaharu oil extracted through different methods gives different fragrances, and different source materials also give different chemical profiles. In the gaharu oil trading market, most buyers request quality assurance from the manufacturer to confirm that the oil purchased meets their standard requirements. Since there is a demand for gaharu oil quality assurance, there is a need to develop a gaharu oil quality control method, so that a standard quality control of gaharu oil can be presented in a certificate of analysis and verified by a laboratory. (author)

  4. [Flavouring estimation of quality of grape wines with use of methods of mathematical statistics].

    Science.gov (United States)

    Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D

    2016-01-01

    The questions of forming an integral estimation of a wine's flavour during tasting are discussed, and the advantages and disadvantages of the procedures are described. As materials for investigation we used natural white and red wines of Russian manufacture, made with traditional technologies from Vitis vinifera, straight hybrids, and blended and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of a wine's nonvolatile matter and its tasting quality rating. The contents of organic acids, amino acids and cations in the wines were considered as the main factors influencing the flavour, since they largely define the beverage's quality. The determination of those components in the wine samples was done by the electrophoretic method «CAPEL». Together with the analytical checking of the samples' quality, a representative group of specialists simultaneously carried out a tasting estimation using the 100-score system. The possibility of statistical modelling of the correlation of the wine tasting estimation, based on analytical data for amino acid and cation determination, reasonably describing the wine flavour, was examined. The statistical modelling of the correlation between the tasting estimation and the content of the major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical valuation within fixed limits of quality accordance, was done with Statistica. Adequate statistical models have been constructed that are able to predict the tasting estimation, that is, to determine the wine's quality from the content of the components forming its flavour properties. It is emphasized that along with aromatic (volatile) substances, the nonvolatile matter - mineral substances and organic substances - amino acids such as proline, threonine, arginine
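
    The regression idea, predicting a tasting score from measured composition, can be shown in its simplest univariate form. The proline values and scores below are invented; the study itself fitted multivariate models in Statistica:

```python
def fit_line(x, y):
    """Ordinary least squares y ~ a*x + b; a univariate stand-in for the
    multivariate score-prediction models described in the record above."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Hypothetical proline contents (mg/L) and 100-score tasting ratings.
proline = [300.0, 450.0, 520.0, 610.0, 700.0]
score = [78.0, 82.0, 84.0, 87.0, 90.0]
a, b = fit_line(proline, score)
predicted_score = a * 500.0 + b
```
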

  5. Using integrated multivariate statistics to assess the hydrochemistry of surface water quality, Lake Taihu basin, China

    Directory of Open Access Journals (Sweden)

    Xiangyu Mu

    2014-09-01

    Full Text Available Natural factors and anthropogenic activities both contribute dissolved chemical loads to lakes and streams. Mineral solubility, geomorphology of the drainage basin, source strengths and climate all contribute to concentrations and their variability. Urbanization and agricultural waste-water in particular lead to aquatic environmental degradation. Major contaminant sources and controls on water quality can be assessed by analyzing the variability in proportions of major and minor solutes in water, coupled to multivariate statistical methods. The demand for freshwater needed for increasing crop production, population and industrialization occurs almost everywhere in China, and these conflicting needs have led to widespread water contamination. Because of heavy nutrient loadings from all of these sources, Lake Taihu (eastern China) notably suffers periodic hyper-eutrophication and drinking water deterioration, which has led to shortages of freshwater for the City of Wuxi and other nearby cities. This lake, the third largest freshwater body in China, has historically been considered a cultural treasure of China, and has supported long-term fisheries. There is increasing pressure to remediate the present contamination, which compromises both aquiculture and the prior economic base centered on tourism. However, remediation cannot be effectively done without first characterizing the broad nature of the non-point source pollution. To this end, we investigated the hydrochemical setting of Lake Taihu to determine how different land use types influence the variability of surface water chemistry in different water sources to the lake. We found that waters show wide variability, ranging from a calcium-magnesium-bicarbonate hydrochemical facies type to a mixed sodium-sulfate-chloride type. Principal components analysis produced three principal components that explained 78% of the variance in the water quality and reflect three major types of water
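
The principal components step can be illustrated with a minimal PCA via the singular value decomposition. The six-column "ion" matrix below is synthetic and only mimics the idea of two underlying hydrochemical facies; it is not Lake Taihu data:

```python
import numpy as np

# Two latent hydrochemical "factors" drive six synthetic ion columns
# (think Ca-Mg-HCO3 vs. Na-SO4-Cl); values are illustrative only.
rng = np.random.default_rng(1)
n = 100
carbonate = rng.normal(0.0, 1.0, n)
saline = rng.normal(0.0, 1.0, n)
X = np.column_stack([carbonate, carbonate, carbonate,
                     saline, saline, saline]) + rng.normal(0.0, 0.2, (n, 6))

# Standardize, then PCA via the singular value decomposition
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)       # variance fraction per component
```

With two latent factors, the first two components capture nearly all the variance; in the study, three components explaining 78% played the analogous role.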

  6. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and reliability of the fabrication process as the original source of a good and reliable quality of the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, a high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of a fabrication process, assuming it is stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance oriented. However, performance orientation is only successful through an open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the west, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase an open attitude of the vendor toward the using utility. The responsibility of the utilities increased considerably to ensure an adequate quality of the fuel they received. As a matter of fact, the utilities sometimes had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the west; however, it now also reaches the utilities in the eastern countries. (author)
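
One of the quality control charts mentioned above can be sketched as an individuals (X) chart, with short-term sigma estimated from the average moving range. The "pellet diameter" measurements below are invented for illustration, not real fuel fabrication data:

```python
import numpy as np

# Synthetic pellet-diameter measurements (mm); the last point is shifted
# to show how the chart flags an out-of-control observation.
x = np.array([8.19, 8.21, 8.20, 8.18, 8.22, 8.20, 8.19, 8.21,
              8.20, 8.18, 8.21, 8.19, 8.20, 8.35])

mr = np.abs(np.diff(x))                   # moving ranges of size 2
sigma = mr.mean() / 1.128                 # d2 constant for subgroup size 2
center = x.mean()
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out_of_control = np.where((x > ucl) | (x < lcl))[0]
```

Points outside the 3-sigma limits (here the shifted last measurement) trigger investigation of the process, which is the online use of these charts the abstract describes.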

  7. Quality control in paediatric nuclear medicine

    International Nuclear Information System (INIS)

    Fischer, S.; Hahn, K.

    1997-01-01

    Nuclear medicine examinations in children require a maximum of quality. This is true for the preparation of the child and parents, the imaging procedure, processing and documentation. It is necessary that quality control of all steps is performed regularly. The aim must be that the children receive a minimum radiation dose, while the study provides high-quality imaging and clinical information. Furthermore, the child should not be unduly affected psychologically by the nuclear medicine examination. (orig.) [de

  8. Quality Assurance and Quality Control in TLD Measurement

    International Nuclear Information System (INIS)

    Bhuiyan, S.I.; Qronfla, M.M.; Abulfaraj, W.H.; Kinsara, A.A.; Taha, T.M.; Molla, N.I.; Elmohr, S.M.

    2008-01-01

    The TLD technique, characterized by high precision and reproducibility of dose measurement, is presented by addressing pre-readout annealing, group sorting, dose evaluation, blind tests, internal dose quality audits and external quality control audits. Two hundred and forty TLD chips were annealed for 1 h at 400 degree C followed by 2 h at 100 degree C. After an exposure of 1 mGy from a 90 Sr irradiator, the TLDs were subjected to pre-readout annealing at 100 degree C, then read out and sorted into groups, each with nearly equal sensitivity. Upon repeating the procedures, TLDs whose response deviated by more than 3.5% from the group mean were dropped to assure group stability. The effect of pre-readout annealing has been studied. Series of repeated measurements were conducted to stabilize the calibration procedures and DCF generation using an SSDL-level 137 Cs calibrator, dose master and ionization chambers. Internal dose quality audits and blind tests were performed and validated by external QC tests with King Abdulaziz City of Science and Technology
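
The group-sorting rule above (rejecting chips that deviate by more than 3.5% from the group mean) can be sketched as follows; the readings are synthetic, in arbitrary TL reader units:

```python
import numpy as np

# One group of 24 chips read out after an identical exposure; chips whose
# response deviates by more than 3.5% from the group mean are rejected.
rng = np.random.default_rng(2)
readings = rng.normal(100.0, 0.8, 24)     # arbitrary TL reader units
readings[5] = 92.0                        # one drifted chip, for illustration

deviation = np.abs(readings - readings.mean()) / readings.mean()
rejected = np.where(deviation > 0.035)[0]
```

Repeating the exposure-readout cycle and re-applying this filter is what stabilizes the groups' sensitivity.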

  9. Quality control procedures in positron tomography

    International Nuclear Information System (INIS)

    Spinks, T.; Jones, T.; Heather, J.; Gilardi, M.

    1989-01-01

    The derivation of physiological parameters in positron tomography relies on accurate calibration of the tomograph. Normally, the calibration relates image pixel count density to the count rate from an external blood counter per unit activity concentration in each device. The quality control of the latter is simple and relies on detector stability assessed by measurement of a standard source of similar geometry to a blood sample. The quality control of the tomographic data depends on (i) detector stability, (ii) uniformity of calibration and normalisation sources and (iii) reproducibility of the attenuation correction procedure. A quality control procedure has been developed for an 8 detector ring (15 transaxial plane) tomograph in which detector response is assessed by acquiring data from retractable transmission ring sources. These are scanned daily and a printout of detector efficiencies is produced, as well as changes from a given date. This provides the raw data from which decisions on recalibration or renormalisation are made. (orig.)

  10. Quality control of the activity meter

    International Nuclear Information System (INIS)

    Rodrigues, Marlon da Silva Brandão; Sá, Lídia Vasconcelos de

    2017-01-01

    Objective: To carry out a comparative analysis of national and international standards regarding the quality control of the activity meters used in Nuclear Medicine Services in Brazil. Material and methods: Quality control protocols from the International Atomic Energy Agency (IAEA), the American Association of Physicists in Medicine (AAPM) and the International Electrotechnical Commission (IEC) were identified and compared with the requirements of both Brazilian regulatory agencies, the National Health Surveillance Agency (ANVISA) and the National Nuclear Energy Commission (CNEN). Results: The daily routine tests recommended by the regulatory agencies do not show significant differences; in contrast, the tests with longer periodicities (accuracy, linearity and precision) show discrepant differences. Conclusion: In view of the comparative analysis carried out, it is suggested that the national recommendations for the quality control tests of the activity meter be reviewed and evaluated, with emphasis on the tests of semiannual and annual periodicity. (author)
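
As an illustration of the linearity test these protocols have in common, readings of a decaying source can be compared with the expected exponential decay. The readings and the ±5% tolerance below are assumptions for the sketch, not values from any of the cited standards:

```python
import numpy as np

# Synthetic activity-meter readings (MBq) of a decaying Tc-99m source,
# compared against the expected exponential decay from the first reading.
half_life_h = 6.0072                                   # Tc-99m half-life (h)
t = np.array([0.0, 6.0, 12.0, 24.0, 48.0])             # hours after first reading
measured = np.array([500.0, 251.0, 125.5, 31.0, 2.05])

expected = measured[0] * 0.5 ** (t / half_life_h)
deviation_pct = 100.0 * (measured - expected) / expected
passes = bool(np.all(np.abs(deviation_pct) <= 5.0))    # assumed tolerance
```

A reading that falls outside the tolerance band at any time point would fail the linearity test and prompt servicing or recalibration of the meter.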

  11. Quality control of 11C-carfentanil

    International Nuclear Information System (INIS)

    Zhang Xiaojun; Zhang Jinming; Tian Jiahe; Xiang Xiaohui

    2013-01-01

    To study the quality control of 11 C-Carfentanil injection, physical, chemical and biological identification were used. The chemical and radiochemical purity of the 11 C-Carfentanil injection were determined by HPLC and a flow-count system; the quantity of product was measured by LC-MS, and the specific activity was calculated from it. The PTS was used to detect endotoxin, and other quality control methods were applied to guarantee the safety of its clinical application. The product appeared colorless and transparent, the radiochemical purity was more than 98%, and the endotoxin content was less than 5 EU/mL. The results showed that the 11 C-Carfentanil injection fulfilled the pharmaceutical quality control requirements and could be applied safely to animal experiments and clinical diagnosis. (authors)

  12. Protein quality control in the nucleus

    DEFF Research Database (Denmark)

    Nielsen, Sofie V.; Poulsen, Esben Guldahl; Rebula, Caio A.

    2014-01-01

    to aggregate, cells have evolved several elaborate quality control systems to deal with these potentially toxic proteins. First, various molecular chaperones will seize the misfolded protein and either attempt to refold the protein or target it for degradation via the ubiquitin-proteasome system...... to be particularly active in protein quality control. Thus, specific ubiquitin-protein ligases located in the nucleus, target not only misfolded nuclear proteins, but also various misfolded cytosolic proteins which are transported to the nucleus prior to their degradation. In comparison, much less is known about...... these mechanisms in mammalian cells. Here we highlight recent advances in our understanding of nuclear protein quality control, in particular regarding substrate recognition and proteasomal degradation....

  13. The Study on quality control of nuclear power installation project

    International Nuclear Information System (INIS)

    Wu Jie

    2008-01-01

    The quality planning, quality assurance and quality control are discussed by applying quality control (QC) theory and taking into account the real situation of the Qinshan II project. This paper is practical and plays an active role in guiding project quality control through the application of the above QC theory and control techniques. (authors)

  14. A quality control program for radiation sources

    International Nuclear Information System (INIS)

    Almeida, C.E. de; Sibata, C.H.; Cecatti, E.R.; Kawakami, N.S.; Alexandre, A.C.; Chiavegatti Junior, M.

    1982-01-01

    An extensive quality control program was established covering the following areas: physical parameters of the therapy machines, dosimetric standards, and preventive maintenance of radiation sources and measuring instruments. A critical evaluation of this program was made after two years (1977-1979) of routine application, and the results are presented. The fluctuation of the physical parameters strongly supports the effort and cost of a quality control program. This program has certainly improved the accuracy required in the delivery of the prescribed dose for radiotherapy treatment. (Author) [pt

  15. Quality control of nuclear medicine instruments 1991

    International Nuclear Information System (INIS)

    1991-05-01

    This document gives detailed guidance on the quality control of various instruments used in nuclear medicine. A first preliminary document was drawn up in 1979. A revised and extended version, incorporating recommended procedures, test schedules and protocols was prepared in 1982. The first edition of ''Quality Control of Nuclear Medicine Instruments'', IAEA-TECDOC-317, was printed in late 1984. Recent advances in the field of nuclear medicine imaging made it necessary to add a chapter on Camera-Computer Systems and another on SPECT Systems. Figs and tabs

  16. Quality control of nuclear medicine instruments, 1991

    International Nuclear Information System (INIS)

    1996-12-01

    This document gives detailed guidance on the quality control of various instruments used in nuclear medicine. A first preliminary document was drawn up in 1979. A revised and extended version, incorporating recommended procedures, test schedules and protocols was prepared in 1982. The first edition of 'Quality Control of Nuclear Medicine Instruments', IAEA-TECDOC-317, was printed in late 1984. Recent advances in the field of nuclear medicine imaging made it necessary to add a chapter on Camera-Computer Systems and another on SPECT Systems

  17. HPLC for quality control of polyimides

    Science.gov (United States)

    Young, P. R.; Sykes, G. F.

    1979-01-01

    High Pressure Liquid Chromatography (HPLC) as a quality control tool for polyimide resins and prepregs is presented. A data base was developed to help establish accept/reject criteria for these materials. This work is intended to supplement, not replace, the standard quality control tests normally conducted on incoming resins and prepregs. To help achieve these objectives, the HPLC separation of LARC-160 polyimide precursor resin was characterized. Room temperature resin aging effects were studied. Graphite-reinforced composites made from fresh and aged resin were fabricated and tested to determine whether the changes observed by HPLC were significant.

  18. Technical quality control - constancy controls for digital mammography systems

    International Nuclear Information System (INIS)

    Pedersen, K.; Landmark, I.D.; Bredholt, K.; Hauge, I.H.R.

    2009-04-01

    To ensure the quality of mammographic images, so-called constancy control tests are performed frequently. The report contains a programme for constancy control of digital mammography systems, encompassing the mammography unit, computed radiography (CR) systems, viewing conditions and displays, printers, and procedures for data collection for patient dose calculations. (Author)

  19. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function, whose shape parameter yields different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
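
The central statistical step, fitting a generalized Gaussian to a feature histogram and reading off the shape parameter, can be sketched with scipy. The "feature responses" below are synthetic draws, not MR filter outputs:

```python
import numpy as np
from scipy.stats import gennorm

# Fit the generalized Gaussian shape parameter to two synthetic feature
# samples: Gaussian-like ("undistorted") and heavier-tailed ("distorted").
rng = np.random.default_rng(3)
beta_clean, _, _ = gennorm.fit(rng.normal(0.0, 1.0, 5000))
beta_heavy, _, _ = gennorm.fit(rng.laplace(0.0, 1.0, 5000))

# The shape-parameter gap is the kind of quality-aware feature the method
# compares against standards learned from high-quality images.
gap = beta_clean - beta_heavy
```

A Gaussian histogram fits with shape near 2 and a Laplacian one near 1, so distortions that fatten the tails of the feature distribution shift the parameter away from the undistorted standard.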

  20. A Comparison of Power Quality Controllers

    Directory of Open Access Journals (Sweden)

    Petr Černek

    2012-01-01

    Full Text Available This paper focuses on certain types of FACTS (Flexible AC Transmission System) controllers, which can be used for improving the power quality at the point of connection with the power network. It focuses on types of controllers that are suitable for use in large buildings, rather than in transmission networks. The goal is to compare the features of the controllers in specific tasks, and to clarify which solution is best for a specific purpose. In some cases it is better and cheaper to use a combination of controllers than a single controller. The paper also presents the features of a shunt active harmonic compensator, a very modern power quality controller that can be used in many cases, alone or in combination with other controllers. The comparison was made using a matrix diagram that resulted from mind maps and other analysis tools. The paper should help engineers to choose the best solution for improving the power quality in a specific power network at distribution level.

  1. No-reference image quality assessment based on statistics of convolution feature maps

    Science.gov (United States)

    Lv, Xiaoxin; Qin, Min; Chen, Xiaohui; Wei, Guo

    2018-04-01

    We propose a Convolutional Feature Maps (CFM) driven approach to accurately predict image quality. Our motivation is based on the finding that Natural Scene Statistics (NSS) features computed on convolutional feature maps are significantly sensitive to the degree of distortion of an image. In our method, a Convolutional Neural Network (CNN) is trained to obtain kernels for generating the CFM. We design a forward NSS layer which operates on the CFM to better extract NSS features. The quality-aware features derived from the output of the NSS layer effectively describe the type and degree of distortion an image has suffered. Finally, a Support Vector Regression (SVR) is employed in our No-Reference Image Quality Assessment (NR-IQA) model to predict the subjective quality score of a distorted image. Experiments conducted on two public databases demonstrate that the performance of the proposed method is competitive with state-of-the-art NR-IQA methods.

  2. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in the Terengganu River Basin

    International Nuclear Information System (INIS)

    Aminu Ibrahim; Hafizan Juahir; Mohd Ekhwan Toriman; Mustapha, A.; Azman Azid; Isiyaka, H.A.

    2015-01-01

    Multivariate statistical techniques, including cluster analysis, discriminant analysis, and principal component analysis/factor analysis, were applied to investigate the spatial variation and pollution sources in the Terengganu river basin during 5 years of monitoring of 13 water quality parameters at thirteen different stations. Cluster analysis (CA) classified the 13 stations into 2 clusters, low polluted (LP) and moderately polluted (MP), based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH 3 -N, PO 4 and EC) and a correct assignation of 95.80%. PCA/FA applied to the data sets yielded five latent factors accounting for 72.42% of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic waste, industry, runoff and agriculture (anthropogenic activities). Therefore, multivariate techniques are important in environmental management. (author)
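
The cluster-analysis step can be sketched with hierarchical (Ward) clustering of station profiles. The 13 "stations" below are synthetic standardized parameter vectors, not Terengganu data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# 13 synthetic "stations": 7 with low and 6 with elevated standardized
# water quality parameters (4 parameters each), mimicking the LP/MP split.
rng = np.random.default_rng(4)
stations = np.vstack([rng.normal(0.0, 0.3, (7, 4)),
                      rng.normal(2.0, 0.3, (6, 4))])

Z = linkage(stations, method='ward')      # agglomerative Ward clustering
labels = fcluster(Z, t=2, criterion='maxclust')
```

Cutting the dendrogram at two clusters recovers the two pollution groups, which is the station classification role CA plays in the study.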

  3. The statistical reporting quality of articles published in 2010 in five dental journals.

    Science.gov (United States)

    Vähänikkilä, Hannu; Tjäderhane, Leo; Nieminen, Pentti

    2015-01-01

    Statistical methods play an important role in medical and dental research. Earlier studies have observed that the current use of methods and reporting of statistics are responsible for some of the errors in the interpretation of results. The aim of this study was to investigate the quality of statistical reporting in dental research articles. A total of 200 articles published in 2010 were analysed, covering five dental journals: Journal of Dental Research, Caries Research, Community Dentistry and Oral Epidemiology, Journal of Dentistry and Acta Odontologica Scandinavica. Each paper underwent careful scrutiny for the use of statistical methods and reporting. A paper with at least one poor reporting item was classified as 'problems with reporting statistics' and a paper without any poor reporting item as 'acceptable'. The investigation showed that 18 (9%) papers were acceptable and 182 (91%) papers contained at least one poor reporting item. The proportion of papers with at least one poor reporting item in this survey was high (91%). The authors of dental journals should be encouraged to improve the statistical sections of their research articles and to present the results in a way that is in line with the policy and presentation of the leading dental journals.

  4. Data Organization for Quality Control Test

    International Nuclear Information System (INIS)

    Yahaya Talib; Glam Hadzir Patai Mohamad; Wan Hamirul Bahrin Wan Kamal

    2011-01-01

    Test data and results for the quality control of the Mo-99/Tc-99m generator must be organized properly. A computer program was developed using Visual Basic 6.0 to process test data, store data and results in specific folders, and generate test reports and certificates. Its performance has been evaluated and tested. (author)

  5. Water quality control program in experimental circuits

    International Nuclear Information System (INIS)

    Cegalla, Miriam A.

    1996-01-01

    The Water Quality Control Program of the Experimental Circuits aims to study the water chemistry of the cooling water in the primary and secondary circuits, monitor the corrosion of the systems, and study the mechanism of corrosion product transport in the systems. (author)

  6. Studies of quality control procedures for radiopharmaceuticals

    International Nuclear Information System (INIS)

    Zivanovic, M.; Trott, N.G.

    1983-01-01

    In this paper, a short description is given of a radiopharmaceutical preparation suite set up at the Royal Marsden Hospital and an account is presented of methods used for quality control of radiopharmaceuticals and of the results obtained over a period of about two and a half years

  7. Materials, methods and quality control, ch. 3

    International Nuclear Information System (INIS)

    Vader, H.L.

    1978-01-01

    A description of the chemical reagents, the 125 I-labelled angiotensin I, the antiserum and the standards is given. A modified measuring method with the New England Nuclear kit for angiotensin I radioimmunoassay is presented as well as the quality control data

  8. Guidelines for radiopharmaceutical quality control in hospitals

    International Nuclear Information System (INIS)

    Welsh, W.J.

    1982-01-01

    This document has been prepared to assist hospital administrators in ensuring that adequate quality control is performed on radiopharmaceuticals administered to their patients. Three sets of guidelines are presented, the degree of sophistication depending on the amount of hospital involvement in radiopharmaceutical preparation

  9. Quality Control Of Selected Pesticides With GC

    Energy Technology Data Exchange (ETDEWEB)

    Karasali, H. [Benaki Phytopathological Institute Laboratory of Physical and Chemical Analysis of Pesticides, Ekalis (Greece)

    2009-07-15

    The practical quality control of selected pesticides with GC is treated. Detailed descriptions are given of the materials and methods used, including sample preparation and GC operating conditions. The systematic validation of multi-methods is described, comprising performance characteristics in routine analysis such as selectivity, specificity, etc. This is illustrated by chromatograms, calibration curves and tables derived from real laboratory data. (author)

  10. Outsourcing University Degrees: Implications for Quality Control

    Science.gov (United States)

    Edwards, Julie; Crosling, Glenda; Edwards, Ron

    2010-01-01

    Education institutions worldwide have and continue to seek opportunities to spread their offerings abroad. While the provision of courses to students located overseas through partner institutions has many advantages, it raises questions about quality control that are not as applicable to other forms of international education. This paper uses a…

  11. Pitch Motion Stabilization by Propeller Speed Control Using Statistical Controller Design

    DEFF Research Database (Denmark)

    Nakatani, Toshihiko; Blanke, Mogens; Galeazzi, Roberto

    2006-01-01

    This paper describes dynamics analysis of a small training boat and a possibility of ship pitch stabilization by control of propeller speed. After upgrading the navigational system of an actual small training boat, in order to identify the model of the ship, the real data collected by sea trials...... were used for statistical analysis and system identification. This analysis shows that the pitching motion is indeed influenced by engine speed and it is suggested that there exists a possibility of reducing the pitching motion by properly controlling the engine throttle. Based on this observation...

  12. Ready-to-Use Simulation: Demystifying Statistical Process Control

    Science.gov (United States)

    Sumukadas, Narendar; Fairfield-Sonn, James W.; Morgan, Sandra

    2005-01-01

    Business students are typically introduced to the concept of process management in their introductory course on operations management. A very important learning outcome here is an appreciation that the management of processes is a key to the management of quality. Some of the related concepts are qualitative, such as strategic and behavioral…

  13. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    Science.gov (United States)

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation at the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
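
An exponentially weighted moving average chart of the kind used in the study can be sketched as follows. The daily output deviations, lambda = 0.2 and L = 3 are textbook assumptions for illustration, not TomoTherapy data:

```python
import numpy as np

# Daily output deviations from nominal (%): in control for 20 days, then a
# sustained 0.8% shift, to show how the EWMA accumulates small changes.
rng = np.random.default_rng(5)
x = np.concatenate([np.zeros(20), np.full(10, 0.8)]) + rng.normal(0.0, 0.3, 30)

lam, L = 0.2, 3.0
mu0, sigma = 0.0, 0.3                     # assumed known in-control parameters
z = np.empty_like(x)
z[0] = lam * x[0] + (1 - lam) * mu0
for i in range(1, len(x)):
    z[i] = lam * x[i] + (1 - lam) * z[i - 1]

k = np.arange(1, len(x) + 1)
width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
signal_days = np.where(np.abs(z - mu0) > width)[0]
```

Unlike an individuals chart, the EWMA flags the sustained sub-sigma shift within a few days of its onset, which is why such charts complement the X-charts in daily output monitoring.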

  14. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Full Text Available Abstract Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.
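
The core idea of a knowledge-based statistical potential, energies as negative log-odds of observed contact frequencies against a reference state, can be sketched in a toy form. The contact counts are invented, and the uniform reference is a simplification of the shuffled reference state discussed above:

```python
import math

# Toy contact counts for three residue pairs; a real potential would use
# counts from a large structure database and a shuffled reference state.
observed = {('LEU', 'ILE'): 120, ('ASP', 'GLU'): 15, ('ASP', 'ARG'): 90}
total = sum(observed.values())
expected = {pair: total / len(observed) for pair in observed}   # uniform reference

# Energy (in kT units): negative log-odds of observed vs. expected frequency
energy = {pair: -math.log(observed[pair] / expected[pair]) for pair in observed}
```

Pairs observed more often than the reference predicts (e.g. hydrophobic LEU-ILE contacts) receive favourable negative energies, and depleted pairs (like-charged ASP-GLU) positive ones; the choice of reference state shifts all these values, which is why the abstract calls it critical.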

  15. The role and relevance of quality assurance to quality control

    International Nuclear Information System (INIS)

    Churchill, G.F.

    1989-01-01

    The paper describes the development of Quality Assurance as a total management technique, incorporating manufacturing and construction Quality Control, to give confidence of satisfactory in-service performance. The application of QA to the Heysham 2 and Torness AGR projects design and construction is defined with particular reference to the development of a QA requirements specification, delegation of QA responsibility through the hierarchy of purchasers and suppliers of plant and material, the role of the QA organization and QA auditing. The paper discusses the effectiveness and benefits of QA and the problems identified in its application and implementation. The problems, their solutions and longer term improvements to reduce the costs of QA as well as enhancing confidence in the satisfactory performance of future nuclear projects, are described. (author)

  16. Quality control of PET/CT

    International Nuclear Information System (INIS)

    Angelova, J.; Zajcharov, M.

    2013-01-01

    Full text: Introduction: The aim of this work is to review the methods for checking and adjusting the computed tomography and positron emission tomography scanner in the Hospital 'Alexandrovska', using the phantoms supplied with the equipment and following the manufacturer's prescription, in order to keep the main image parameters within certain limits. Materials and Methods: At the start of work, a check of the laser setting for patient positioning and a warm-up of the X-ray tube of the scanner for better image quality were made. Daily verification procedures on the CT image quality using the water phantom and weekly 'air' calibrations were carried out. In the positron part, daily control involves checking the resolution and sensitivity of the scanner using the built-in Ga68 phantom. At commissioning, after repair and at least once a year, it is necessary to verify the accuracy of registration of the pulses from the crystal with a water phantom of known volume, and the coincidence between the CT and PET images. Results: The process of quality control is iterative. The results are displayed in tables and graphically, with the goal that the individual values fall within the range determined by the manufacturer and meet the standards for image quality. If necessary, the procedure is repeated several times until this is fulfilled. Conclusion: Ensuring the quality of the image in positron emission tomography combined with computed tomography is inextricably linked to accurate and precise diagnosis of tumor processes in the human body

  17. Quality assurance and quality control of nuclear engineering during construction phase

    International Nuclear Information System (INIS)

    Zhang Zhihua; Deng Yue; Liu Yaoguang; Xu Xianqi; Zhou Shan; Qian Dazhi; Zhang Yang

    2007-01-01

    Quality assurance (QA) and quality control (QC) are very important activities in nuclear engineering. This paper starts with how to establish the quality assurance system for the construction phase of nuclear engineering, and then introduces several experiences and techniques, such as the implementation of the quality assurance program, the quality assurance and quality control of contractors, the quality surveillance and control performed by supervisory companies, and the quality assurance audits and surveillance of builders. (authors)

  18. Quality and Control of Water Vapor Winds

    Science.gov (United States)

    Jedlovec, Gary J.; Atkinson, Robert J.

    1996-01-01

    Water vapor imagery from the geostationary satellites such as GOES, Meteosat, and GMS provides synoptic views of dynamical events on a continual basis. Because the imagery represents a non-linear combination of mid- and upper-tropospheric thermodynamic parameters (three-dimensional variations in temperature and humidity), video loops of these image products provide enlightening views of regional flow fields, the movement of tropical and extratropical storm systems, the transfer of moisture between hemispheres and from the tropics to the mid-latitudes, and the dominance of high pressure systems over particular regions of the Earth. Despite the obvious larger scale features, the water vapor imagery contains significant image variability down to the single 8 km GOES pixel. These features can be quantitatively identified and tracked from one time to the next using various image processing techniques. Merrill et al. (1991), Hayden and Schmidt (1992), and Laurent (1993) have documented the operational procedures and capabilities of NOAA and ESOC to produce cloud and water vapor winds. These techniques employ standard correlation and template matching approaches to wind tracking and use qualitative and quantitative procedures to eliminate bad wind vectors from the wind data set. Techniques have also been developed to improve the quality of the operational winds through robust editing procedures (Hayden and Veldon 1991). These quality and control approaches have limitations, are often subjective, and constrain wind variability to be consistent with model derived wind fields. This paper describes research focused on the refinement of objective quality and control parameters for water vapor wind vector data sets. New quality and control measures are developed and employed to provide a more robust wind data set for climate analysis, data assimilation studies, as well as operational weather forecasting. The parameters are applicable to cloud-tracked winds as well with minor
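The template-matching step at the core of these wind-tracking techniques can be sketched as follows. This is a minimal normalized cross-correlation search, with a synthetic "water vapor" field standing in for real imagery; the displacement, patch size and field are all illustrative assumptions, not the operational algorithm:

```python
import numpy as np

def best_match(patch, search):
    """Locate a template patch in a search image by maximizing the
    normalized cross-correlation (the standard tracking approach)."""
    ph, pw = patch.shape
    p = (patch - patch.mean()) / patch.std()
    best, best_score = (0, 0), -np.inf
    for i in range(search.shape[0] - ph + 1):
        for j in range(search.shape[1] - pw + 1):
            w = search[i:i + ph, j:j + pw]
            score = np.mean(p * (w - w.mean()) / (w.std() + 1e-12))
            if score > best_score:
                best_score, best = score, (i, j)
    return best, best_score

# synthetic field; the second frame is the first shifted by (3, 5) pixels
rng = np.random.default_rng(6)
frame0 = rng.normal(size=(40, 40))
frame1 = np.roll(np.roll(frame0, 3, axis=0), 5, axis=1)
patch = frame0[10:18, 10:18]
(di, dj), score = best_match(patch, frame1)   # recovered patch position
```

The recovered offset relative to the patch's original position gives the displacement vector; dividing by the time between frames yields a wind estimate, which the quality-control procedures described in the abstract would then accept or reject.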

  19. Medicaid Fraud Control Units (MFCU) Annual Spending and Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — Medicaid Fraud Control Units (MFCU or Unit) investigate and prosecute Medicaid fraud as well as patient abuse and neglect in health care facilities. OIG certifies,...

  20. The Use of Statistical Methods in Dimensional Process Control

    National Research Council Canada - National Science Library

    Krajcsik, Stephen

    1985-01-01

    ... erection. To achieve this high degree of unit accuracy, we have begun a pilot dimensional control program that has set the guidelines for systematically monitoring each stage of the production process prior to erection...

  1. Statistical and trend analysis of water quality and quantity data for the Strymon River in Greece

    Directory of Open Access Journals (Sweden)

    V. Z. Antonopoulos

    2001-01-01

    Full Text Available Strymon is a transboundary river of Greece, Bulgaria and the Former Yugoslav Republic of Macedonia (FYROM) in southeastern Europe. Water quality parameters and the discharge have been monitored each month just 10 km downstream of the river's entry into Greece. The data for nine water quality variables (T, ECw, DO, SO42-, Na++K+, Mg2+, Ca2+, NO3‾, TP) and the discharge for the period 1980-1997 were selected for this analysis. In this paper (a) the time series of monthly values of the water quality parameters and the discharge were analysed using statistical methods, (b) the existence of trends was tested and the best fitted models were evaluated, and (c) the relationships of the concentrations and loads of constituents with the discharge were examined. Boxplots were used for summarising the distribution of each data set. The χ²-test and the Kolmogorov-Smirnov test were used to select the theoretical distribution which best fitted the data. Simple regression was used to examine the concentration-discharge and the load-discharge relationships. According to the correlation coefficient (r) values the relation between concentrations and discharge is weak (r 0.902). Trends were detected using the nonparametric Spearman's criterion upon the data for the variables Q, ECw, DO, SO42-, Na++K+ and NO3‾, on which temporal trend analysis was performed. Keywords: Strymon river, water quality, discharge, concentration, load, statistics, trends
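The trend-detection step described above, Spearman's rank criterion applied to a monthly series, can be sketched as follows. The series, the drift, and the 0.05 significance level are illustrative, not the Strymon data:

```python
import numpy as np
from scipy import stats

def spearman_trend(series, alpha=0.05):
    """Test a time series for monotonic trend via Spearman's rank
    correlation between the values and their time index."""
    t = np.arange(len(series))
    rho, p_value = stats.spearmanr(t, series)
    return rho, p_value, p_value < alpha

# illustrative monthly series with a mild upward drift plus noise
rng = np.random.default_rng(0)
q = 10 + 0.05 * np.arange(120) + rng.normal(0, 1, 120)
rho, p, has_trend = spearman_trend(q)   # rho > 0 indicates an upward trend
```

A significant positive rho flags an increasing trend in the variable; the same test is applied variable by variable (Q, ECw, DO, and so on) in the analysis described in the abstract.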

  2. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Directory of Open Access Journals (Sweden)

    Fabiano Rodrigues Soriano

    Full Text Available Abstract Statistical Process Control (SPC) is a set of statistical techniques for process control, focused on monitoring and analyzing the causes of variation in quality characteristics and/or in the parameters used to control and improve processes. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as lack of top management commitment and little understanding of its potential benefits, and to technical factors, such as lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between the conditioning factors associated with top management commitment (support), SPC training and application, as well as the relationships between these factors and the benefits associated with implementing the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used for the analysis, since the main goal is to establish causal relations. A cross-sectional survey was used to collect information from a sample of Brazilian auto-parts companies, selected from the guides of the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey; however, only 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs. In turn, this training affects the way companies apply the techniques, which is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected and are represented by the relation between top management support and training. The technical aspects observed through SPC

  3. Computer-aided control of high-quality cast iron

    Directory of Open Access Journals (Sweden)

    S. Pietrowski

    2008-04-01

    Full Text Available The study discusses the possibility of controlling high-quality grey cast iron and ductile iron using the author's own computer programs. The programs were developed using algorithms based on statistical relationships between the characteristic parameters of DTA curves and properties such as Rp0,2, Rm, A5 and HB. It has been proved that the spheroidisation and inoculation treatment of cast iron significantly changes the characteristic parameters of the DTA curves, thus enabling control of these operations as regards their correctness and effectiveness, along with the related changes in the microstructure and mechanical properties of the cast iron. Moreover, examples of the statistical relationships between the typical properties of ductile iron and its control process are given for melts consistent and inconsistent with the adopted technology. A test stand for control of high-quality cast iron and the respective melts is schematically depicted.

  4. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    Science.gov (United States)

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; therefore, its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method for reducing complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a statistically grounded measurement of surgical quality.

  5. Water quality, Multivariate statistical techniques, submarine out fall, spatial variation, temporal variation

    International Nuclear Information System (INIS)

    Garcia, Francisco; Palacio, Carlos; Garcia, Uriel

    2012-01-01

    Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality in the Santa Marta coastal area, where a submarine outfall discharges 1 m³/s of domestic wastewater. Two-way analysis of variance (ANOVA), cluster analysis, principal component analysis and Kriging interpolation were considered for this report. The temporal variation showed two heterogeneous periods: from December to April, and July, when the concentrations of the water quality parameters were higher, and the rest of the year (May, June, August-November), when they were significantly lower. The spatial variation revealed two areas with different water quality; this difference is related to the proximity to the submarine outfall discharge.
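One of the techniques named above, principal component analysis used to separate periods with different water quality, can be sketched with a plain SVD on standardized data. The synthetic matrix below stands in for the Santa Marta measurements (rows are monthly samples, columns are parameters):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via SVD on standardized data."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_components].T          # sample coordinates on the PCs
    explained = s[:n_components] ** 2 / np.sum(s ** 2)
    return scores, explained

# synthetic monthly samples of a few water-quality parameters;
# the first block mimics a season with systematically higher concentrations
rng = np.random.default_rng(1)
X = rng.normal(size=(36, 5))
X[:12] += 2.0
scores, explained = pca(X)
```

Plotting the first two score columns typically shows the high-concentration months clustering apart from the rest, which is how PCA supports the two-period interpretation reported in the abstract.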

  6. 42 CFR 84.40 - Quality control plans; filing requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control plans; filing requirements. 84.40... Control § 84.40 Quality control plans; filing requirements. As a part of each application for approval or... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  7. 21 CFR 211.22 - Responsibilities of quality control unit.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Responsibilities of quality control unit. 211.22... Personnel § 211.22 Responsibilities of quality control unit. (a) There shall be a quality control unit that... have been fully investigated. The quality control unit shall be responsible for approving or rejecting...

  8. 30 CFR 28.30 - Quality control plans; filing requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  9. Quality Controlling CMIP datasets at GFDL

    Science.gov (United States)

    Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.

    2017-12-01

    As GFDL makes the switch from model development to production in light of the Climate Model Intercomparison Project (CMIP), its efforts have shifted to testing and, more importantly, to establishing guidelines and protocols for quality control and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze and quality-control the datasets before they are published and before the knowledge drawn from them makes its way into reports like the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality-control the CMIP-ready datasets, including Jupyter notebooks, PrePARE, and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively, and to provide additional metadata and analysis services along with some built-in controlled-vocabulary validations in the workflow. In addition, we discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) into our CMIP6 workflow.

  10. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
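The Monte-Carlo validation mentioned in the abstract can be sketched in a much-simplified form. Everything below is an illustrative assumption rather than the paper's model: a single interfering cell, fractional (slow) power control, and lognormal shadowing standing in for Generalized-K composite fading:

```python
import numpy as np

def uplink_ici_monte_carlo(n_drops=100_000, alpha=3.5, eps=0.8,
                           cell_radius=1.0, bs_distance=2.0, seed=0):
    """Monte-Carlo estimate of the uplink inter-cell interference seen
    by a victim base station from users in one neighbouring cell,
    under fractional power control (all parameters illustrative)."""
    rng = np.random.default_rng(seed)
    # uniform user drop in the neighbouring cell (disc of radius cell_radius)
    r = cell_radius * np.sqrt(rng.uniform(size=n_drops))
    theta = rng.uniform(0, 2 * np.pi, n_drops)
    d_own = r                                  # distance to the user's own BS
    x, y = r * np.cos(theta), r * np.sin(theta)
    d_victim = np.hypot(x - bs_distance, y)    # distance to the victim BS
    # fractional power control compensates a fraction eps of the path loss
    p_tx = d_own ** (alpha * eps)
    # lognormal shadowing stands in for composite fading (assumption)
    shadow = rng.lognormal(mean=0.0, sigma=0.5, size=n_drops)
    ici = p_tx * shadow * d_victim ** (-alpha)
    return ici.mean(), ici.var()

mean_ici, var_ici = uplink_ici_monte_carlo()
```

In a validation study, these empirical moments (or the full empirical distribution) would be compared against the values predicted by the derived closed-form expressions across the parameter sweep.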

  11. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2013-01-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  12. Determination and evaluation of air quality control. Manual of ambient air quality control in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Lahmann, E.

    1997-07-01

    Measurements of air pollutant emissions and of ambient air quality are essential instruments for air quality control. Through such measurements, pollutants are registered both at their place of origin and at the places where they may affect people or the environment. The two types of measurement complement each other and are essential for the implementation of air quality legislation, particularly for compliance with emission and ambient air quality limit values. The manual presents accounts of the measurement principles and also contains, as an appendix, a list of suitability-tested measuring devices based on information provided by the manufacturers. In addition, the guide to ambient air quality control contains further information on discontinuous measurement methods, on measurement planning and on the assessment of ambient air quality data. (orig./SR)

  13. Quality Risk Management: Putting GMP Controls First.

    Science.gov (United States)

    O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala

    2012-01-01

    This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing process (GMP) controls during those QRM exercises. 
A GMP control can be considered
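The severity, occurrence and detection ratings discussed above are conventionally combined into a Risk Priority Number in FMEA-style risk assessments; this is the standard textbook combination, not necessarily the scheme proposed in the paper. A minimal sketch:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: the product of the severity, occurrence
    and detection ratings, each on an illustrative 1-10 scale."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("ratings are expected on a 1-10 scale")
    return severity * occurrence * detection

# a GMP control (e.g. an in-process check; hypothetical example) cannot
# usually change severity, but it can improve the detection rating
before = rpn(8, 5, 7)   # failure mode with no control in place
after = rpn(8, 5, 2)    # same failure mode with the control applied
```

Classifying each GMP control by which rating it actually affects, as the paper argues, keeps such scores from being lowered on the basis of controls that do not really reduce occurrence or improve detection.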

  14. To control with health: from statistics to strategy.

    Science.gov (United States)

    Larsson, Johan; Landstad, Bodil; Vinberg, Stig

    2009-01-01

    The main purpose of this study is to develop and test a generic model for workplace health management in organizations. Four private and four public organizations in northern Sweden were selected for the study. A model for health control was developed on the basis of a literature review and dialogues with the stakeholders in the workplaces. The model was then implemented at the workplaces during a two-year period. Interviews with leaders and co-workers were conducted on two occasions and were analyzed using content analysis and the constant comparison method. By using a grounded theory approach, three main categories were found: health closure and other health and working environment indicators, monetary accounting of health related indicators and changes in leadership behaviour and organizational practices. An important result was that the model influenced leadership values more than leadership and organizational methodologies. From the results a model for workplace health management is proposed, incorporating the planning, control, and improvement structures. The purpose of the model is to take health aspects into consideration when deciding organizational structure (work demands, control and social support). The model controls health by using health-related indicators with high frequency measuring whereas workplace health promotion is done in a structured way with a reflective model.

  15. Microbiological Quality Control of Probiotic Products

    OpenAIRE

    Astashkina, A.P.; Khudyakova, L.I.; Kolbysheva, Y.V.

    2014-01-01

    Microbiological quality control of probiotic products such as Imunele, Dannon and Pomogayka showed that they contain living cultures of the genera Lactobacillus and Bifidobacterium in the amount of 10^7 CFU/ml, which corresponds to the number indicated on the product labels. It was found that the survival rate of test strains cultured with pasteurized products does not exceed 10%. The cell concentration of target microorganisms was reduced by 20-45% after interaction with living probiotic b...

  16. Mitochondrial quality control in cardiac diseases.

    Directory of Open Access Journals (Sweden)

    Juliane Campos

    2016-10-01

    Full Text Available Disruption of mitochondrial homeostasis is a hallmark of cardiac diseases. Therefore, maintenance of mitochondrial integrity through different surveillance mechanisms is critical for cardiomyocyte survival. In this review, we discuss the most recent findings on the central role of mitochondrial quality control processes including regulation of mitochondrial redox balance, aldehyde metabolism, proteostasis, dynamics and clearance in cardiac diseases, highlighting their potential as therapeutic targets.

  17. Quality control and the multicrystal counter

    International Nuclear Information System (INIS)

    Hart, G.C.; Davis, K.M.

    1983-01-01

    The reliability of multicrystal counters for use in counting large numbers of radioimmunoassay samples is studied. In particular, the dependencies of the outputs from the array of detectors, and hence their degree of matching, on the count rate and volume of the samples being counted are investigated. Quality control procedures are described to assist in the assurance of consistent performance of the counter in the clinical situation. (U.K.)

  18. The software quality control for gamma spectrometry

    International Nuclear Information System (INIS)

    Monte, L.

    1986-01-01

    One of the major problems confronting the quality control programme of an environmental measurements laboratory is the evaluation of the performance of software packages for the analysis of gamma-ray spectra. A programme of tests for evaluating the performance of the software package used by our laboratory (SPECTRAN-F, Canberra Inc.) is being carried out. In this first paper the results of a preliminary study concerning the evaluation of the performance of the doublet analysis routine are presented

  19. Quality control of estrogen receptor assays.

    Science.gov (United States)

    Godolphin, W; Jacobson, B

    1980-01-01

    Four types of material have been used for the quality control of routine assays of estrogen receptors in human breast tumors. Pieces of hormone-dependent Nb rat mammary tumors gave a precision of about 40%. Rat uteri and rat tumors pulverized at liquid-nitrogen temperature and stored as powder yielded a precision of about 30%. Powdered and lyophilised human tumors appear to be the best, with a precision as good as 17%.

  20. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six megavolts with nine fields were used for the IMRT plan and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are good quality because Cpml values are higher than 1.0.
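The two quantities at the heart of this analysis, control limits estimated from a baseline sample and the Cpml capability index, can be sketched as follows. The 2.66 moving-range constant is the usual individuals-chart convention, the Cpml form below is a common textbook definition with target 100%, and the simulated % gamma pass data are illustrative, not the study's measurements:

```python
import numpy as np

def individuals_control_limits(x):
    """Shewhart individuals-chart limits: centre line +/- 2.66 times
    the mean moving range (the standard I-MR chart constant)."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))
    centre = x.mean()
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

def cpml(x, lsl, target=100.0):
    """Capability index against a lower spec limit, penalising both
    spread and deviation from the target (a common Cpm-style form)."""
    x = np.asarray(x, dtype=float)
    tau = np.sqrt(x.var() + (x.mean() - target) ** 2)
    return (x.mean() - lsl) / (3 * tau)

# simulated baseline of 50 % gamma pass results, echoing the VMAT figures
rng = np.random.default_rng(2)
gamma_pass = rng.normal(96.7, 2.2, size=50)
lcl, centre, ucl = individuals_control_limits(gamma_pass)
cap = cpml(gamma_pass, lsl=90.0)
```

As in the study, the first 50 QA results set the limits; later plans falling below the lower control limit would trigger investigation, and a capability index above 1.0 indicates an adequate QA process.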

  1. Software for creating quality control database in diagnostic radiology

    International Nuclear Information System (INIS)

    Stoeva, M.; Spassov, G.; Tabakov, S.

    2000-01-01

    The paper describes a PC based program with database for quality control (QC). It keeps information about all surveyed equipment and measured parameters. The first function of the program is to extract information from old (existing) MS Excel spreadsheets with QC surveys. The second function is used for input of measurements which are automatically organized in MS Excel spreadsheets and built into the database. The spreadsheets are based on the protocols described in the EMERALD Training Scheme. In addition, the program can make statistics of all measured parameters, both in absolute term and in time

  2. Adaptive statistical iterative reconstruction: reducing dose while preserving image quality in the pediatric head CT examination

    Energy Technology Data Exchange (ETDEWEB)

    McKnight, Colin D.; Watcharotone, Kuanwong; Ibrahim, Mohannad; Christodoulou, Emmanuel; Baer, Aaron H.; Parmar, Hemant A. [University of Michigan, Department of Radiology, Ann Arbor, MI (United States)

    2014-08-15

    Over the last decade there has been escalating concern regarding the increasing radiation exposure stemming from CT exams, particularly in children. Adaptive statistical iterative reconstruction (ASIR) is a relatively new and promising tool to reduce radiation dose while preserving image quality. While encouraging results have been found in adult head and chest and body imaging, validation of this technique in pediatric population is limited. The objective of our study was to retrospectively compare the image quality and radiation dose of pediatric head CT examinations obtained with ASIR compared to pediatric head CT examinations without ASIR in a large patient population. Retrospective analysis was performed on 82 pediatric head CT examinations. This group included 33 pediatric head CT examinations obtained with ASIR and 49 pediatric head CT examinations without ASIR. Computed tomography dose index (CTDI{sub vol}) was recorded on all examinations. Quantitative analysis consisted of standardized measurement of attenuation and the standard deviation at the bilateral centrum semiovale and cerebellar white matter to evaluate objective noise. Qualitative analysis consisted of independent assessment by two radiologists in a blinded manner of gray-white differentiation, sharpness and overall diagnostic quality. The average CTDI{sub vol} value of the ASIR group was 21.8 mGy (SD = 4.0) while the average CTDI{sub vol} for the non-ASIR group was 29.7 mGy (SD = 13.8), reflecting a statistically significant reduction in CTDI{sub vol} in the ASIR group (P < 0.01). There were statistically significant reductions in CTDI for the 3- to 12-year-old ASIR group as compared to the 3- to 12-year-old non-ASIR group (21.5 mGy vs. 30.0 mGy; P = 0.004) as well as statistically significant reductions in CTDI for the >12-year-old ASIR group as compared to the >12-year-old non-ASIR group (29.7 mGy vs. 49.9 mGy; P = 0.0002). Quantitative analysis revealed no significant difference in the
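The group comparison reported above (a lower mean CTDIvol with ASIR, with clearly unequal spreads) can be reproduced in outline with Welch's unequal-variance t-test. The samples below are synthetic draws that merely echo the abstract's means, standard deviations and group sizes:

```python
import numpy as np
from scipy import stats

# illustrative dose samples (mGy); group statistics echo the abstract,
# but the individual values are synthetic
rng = np.random.default_rng(3)
asir = rng.normal(21.8, 4.0, size=33)
non_asir = rng.normal(29.7, 13.8, size=49)

# Welch's t-test: appropriate here because the group SDs differ markedly
t_stat, p_value = stats.ttest_ind(asir, non_asir, equal_var=False)
```

A negative t statistic with a small p-value corresponds to the abstract's finding of a statistically significant dose reduction in the ASIR group.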

  3. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    International Nuclear Information System (INIS)

    Carver, A; Rowbottom, C

    2016-01-01

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed with Shewhart control charts, using Matlab for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful for detecting beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might naively be expected, such as between beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian
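A control chart of the kind used in this study, with limits at the 95% level estimated from a baseline window, can be sketched as follows (in Python rather than the authors' Matlab; the beam-output readings and the step change are synthetic):

```python
import numpy as np

def shewhart_flags(x, baseline_n=20, z=1.96):
    """Flag points outside mean +/- z*sigma limits estimated from the
    first baseline_n observations (z = 1.96 gives 95% limits)."""
    x = np.asarray(x, dtype=float)
    base = x[:baseline_n]
    mu, sigma = base.mean(), base.std(ddof=1)
    lcl, ucl = mu - z * sigma, mu + z * sigma
    return (x < lcl) | (x > ucl), (lcl, ucl)

# daily beam-output readings (%, synthetic), with a step change at day 40
rng = np.random.default_rng(4)
output = np.concatenate([rng.normal(100.0, 0.3, 40),
                         rng.normal(101.5, 0.3, 10)])
flags, (lcl, ucl) = shewhart_flags(output)   # flags the post-change points
```

Running such a check automatically on every MPC parameter is what lets large quantities of tests be monitored without visual inspection, as the conclusion notes.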

  4. Metrological aspects to quality control for natural gas analyses

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Claudia Cipriano; Borges, Cleber Nogueira; Cunha, Valnei S. [Instituto Nacional de Metrologia, Normalizacao e Qualidade Industrial (INMETRO), Rio de Janeiro, RJ (Brazil); Augusto, Cristiane R. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil); Augusto, Marco Ignazio [Companhia Estadual de Gas do Rio de Janeiro (CEG), RJ (Brazil)

    2008-07-01

    Product and service quality is a fundamental topic in globalized commerce, and this includes measurements of natural gas. With the inclusion of natural gas among Brazilian energy resources, considerable investment was required from industry, especially for quality control of the commercialized gas. The Brazilian regulatory agency, ANP (Agencia Nacional de Petroleo, Gas Natural e Biocombustiveis), created Resolution ANP no. 16. This Resolution defines the specification of natural gas, whether of national or international origin, for commercialization in Brazil and lists tolerance concentrations for some components. Among these components are inert compounds such as CO{sub 2} and N{sub 2}. Their presence reduces the calorific power, increases the resistance to detonation in vehicular applications, and lowers the methane concentration of the gas. Control charts can be used to verify whether or not the process is under statistical control. The process can be considered under statistical control if the measured values lie between previously stated lower and upper limits. Control charts can address several characteristics of each subgroup: means, standard deviations, ranges or proportions of defects. The charts are drawn for a specific characteristic and detect deviations in the process under specific environmental conditions. CEG (Companhia de Distribuicao de Gas do Rio de Janeiro) and DQUIM (Chemical Metrology Division) have an agreement for technical cooperation in research and development on natural gas composition. Given the importance of natural gas to national development, as well as the questions concerning custody transfer, the objective of this work is to demonstrate quality control of the natural gas composition between the CEG laboratory and the DQUIM laboratory, aiming to increase the quality of the

  5. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)
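    The contrast-to-noise ratio reported above is, in one common formulation, the attenuation difference between a vessel and the background divided by the background noise. The sketch below uses that definition with made-up Hounsfield-unit samples; the paper's exact formula and regions of interest may differ.

```python
import statistics

def cnr(roi_vessel, roi_background):
    """Contrast-to-noise ratio: difference of mean ROI attenuation divided
    by the standard deviation (noise) of the background ROI."""
    contrast = statistics.mean(roi_vessel) - statistics.mean(roi_background)
    return contrast / statistics.stdev(roi_background)

# Illustrative HU samples, not patient data.
portal_vein = [180, 185, 178, 182, 181]
liver_background = [95, 100, 98, 102, 97, 99, 101]
ratio = cnr(portal_vein, liver_background)
```

    A noise-reducing reconstruction such as ASIR shrinks the denominator, which is why the CNRs in the study improve.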

  6. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  7. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means that guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs

  8. Inter-vehicle gap statistics on signal-controlled crossroads

    International Nuclear Information System (INIS)

    Krbalek, Milan

    2008-01-01

    We investigate the microscopic structure of a chain of cars waiting at a red signal on signal-controlled crossroads. A one-dimensional space-continuous thermodynamical model leading to excellent agreement with the measured data is presented. Moreover, we demonstrate that the inter-vehicle spacing distribution disclosed in the relevant traffic data agrees with the thermal-balance distribution of particles in the thermodynamical traffic gas (discussed in [1]) with a high inverse temperature (corresponding to strong traffic congestion). Therefore, as we affirm, such a system of stationary cars can be understood as a specific state of the traffic sample operating inside a congested traffic stream

  9. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that

  10. 7 CFR 58.642 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    § 58.642 Quality control tests. All mix ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow line samples as...

  11. 7 CFR 58.928 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be...

  12. 7 CFR 58.335 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    § 58.335 Quality control tests. All milk, cream and related products are subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow...

  13. SU-E-T-77: A Statistical Approach to Manage Quality for Pre-Treatment Verification in IMRT/VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Jassal, K [Fortis Memorial Research Institute, Gurgaon, Haryana (India); Sarkar, B [AMRI Cancer Centre and GLA university, Mathura, Kolkata, West Bengal (India); Mohanti, B; Roy, S; Ganesh, T [FMRI, Gurgaon, Haryana (India); Munshi, A [Fortis Memorial Research Institute, Gurgon, Haryana (India); Chougule, A [SMS Medical College and Hospital, Jaipur, Rajasthan (India); Sachdev, K [Malaviya National Institute of Technology, Jaipur, Rajasthan (India)

    2015-06-15

    Objective: The study presents the application of the simple concept of statistical process control (SPC) to the analysis of pre-treatment quality assurance procedures for planar dose measurements performed using a 2D array and an a-Si electronic portal imaging device (a-Si EPID). Method: A total of 195 patients of four different anatomical sites: brain (n1=45), head & neck (n2=45), thorax (n3=50) and pelvis (n4=55) were selected for the study. Pre-treatment quality assurance for the clinically acceptable IMRT/VMAT plans was measured with the 2D array and the a-Si EPID of the accelerator. After the γ-analysis, control charts and the quality index Cpm were evaluated for each cohort. Results: The mean and σ of the γ (3%/3 mm) pass rate were 99.9% ± 1.15% for the a-Si EPID and 99.6% ± 1.06% for the 2D array. Among all plans, γmax was consistently lower for the 2D array as compared to the a-Si EPID. Fig. 1 presents the X-bar control charts for every cohort. Cpm values for the a-Si EPID were found to be higher than those for the array; detailed results are presented in Table 1. Conclusion: The present study demonstrates the significance of control charts used for quality management purposes in newer radiotherapy clinics, and provides a pictorial overview of clinic performance for advanced radiotherapy techniques. Higher Cpm values for the EPID indicate its higher efficiency compared with array-based measurements.
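    The quality index used above, Taguchi's Cpm, penalizes both spread and deviation from a target value. A minimal sketch, assuming hypothetical specification limits, target and γ pass-rate data (the study's actual limits are not given in the abstract):

```python
import math
import statistics

def cpm(data, lsl, usl, target):
    """Taguchi capability index:
    Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mean - target)^2))."""
    s2 = statistics.pvariance(data)  # population variance, one common convention
    mu = statistics.mean(data)
    return (usl - lsl) / (6 * math.sqrt(s2 + (mu - target) ** 2))

# Hypothetical gamma pass rates (%) with assumed spec limits and target.
rates = [99.9, 99.5, 99.8, 100.0, 99.7]
index = cpm(rates, lsl=95.0, usl=100.0, target=100.0)
```

    A higher Cpm means the pass rates sit tightly around the target relative to the specification width, which is the sense in which the study ranks EPID above the array.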

  14. Use of statistical process control in evaluation of academic performance

    Directory of Open Access Journals (Sweden)

    Ezequiel Gibbon Gautério

    2014-05-01

    Full Text Available The aim of this article was to study some indicators of academic performance (number of students per class, dropout rate, failure rate and scores obtained by the students) in order to identify patterns of behavior that would make it possible to implement improvements in the teaching-learning process. The sample was composed of five classes of undergraduate courses in Engineering. The data were collected over three years. Initially, an exploratory analysis with analytical and graphical techniques was performed. Analysis of variance and Tukey’s test were used to investigate some sources of variability. This information was used in the construction of control charts. We found evidence that classes with more students are associated with higher failure rates and lower mean scores. Moreover, when the course occurred later in the curriculum, the students had higher scores. The results showed that, although some special causes interfering with the process were detected, it was possible to stabilize and monitor the process.
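    Control charts like those mentioned above can be built from rational subgroups (for example, one subgroup of student scores per class per term). The sketch below computes X-bar chart limits from subgroup means and ranges using the standard A2 chart constants; the score data are invented for illustration and do not come from the study.

```python
import statistics

# Standard X-bar chart constants A2, indexed by subgroup size.
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}

def xbar_chart(subgroups):
    """Center line and control limits: grand mean +/- A2 * mean range."""
    n = len(subgroups[0])
    xbars = [statistics.mean(g) for g in subgroups]
    rbar = statistics.mean(max(g) - min(g) for g in subgroups)
    center = statistics.mean(xbars)
    delta = A2[n] * rbar
    return center - delta, center, center + delta

# Invented subgroups of five student scores each.
scores = [[6.2, 7.1, 5.8, 6.5, 6.9],
          [6.0, 6.8, 5.5, 6.2, 6.4],
          [6.4, 7.0, 6.1, 6.6, 6.8]]
lcl, center, ucl = xbar_chart(scores)
```

    A subgroup mean outside (lcl, ucl) would signal a special cause, such as the class-size effect the study detected.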

  15. Assessment of water quality of a river-dominated estuary with hydrochemical parameters: A statistical approach.

    Digital Repository Service at National Institute of Oceanography (India)

    Padma, P.; Sheela, V.S.; Suryakumari, S.; Jayalakshmy, K.V.; Nair, S.M.; Kumar, N.C.

    Water Qual Expo Health, DOI 10.1007/s12403-014-0115-9, ORIGINAL PAPER: Assessment of Water Quality of a River-Dominated Estuary with Hydrochemical Parameters: A Statistical Approach. P. Padma · V. S. Sheela · S. Suryakumari · K. V. Jayalakshmy · S. M. Nair...

  16. Development of phantom periapical for control quality

    International Nuclear Information System (INIS)

    Mendes, J.M.S.; Sales Junior, E.S.; Ferreira, F.C.L.; Paschoal, C.M.M.

    2015-01-01

    This study aimed to develop a dental phantom with cysts for the evaluation of periapical radiographs; the phantom was tested in private dental offices in the city of Maraba, northern Brazil. Using the simulator object (phantom), 12 periapical radiographs were obtained (one in each of the offices visited). Those complying with the standards of Ordinance No. 453 were evaluated visually, observing the physical exposure parameters (kVp and mA) and the development time of the radiographic film; the remaining radiographs were then compared visually with radiograph C6, set as the standard. Among the results, it was found that in only two of the twelve radiographs the cysts could not be visualized, and these two images were therefore deemed unsuitable for accurate diagnosis; in the other 10 images the cysts could be displayed, although the images differed in quality. In addition, it can be concluded that the performance of the phantom was highly satisfactory, proving efficient for use in quality control testing of dental X-ray units, quality control of radiographs and continuing education of dental professionals, at a much more accessible price. (authors)

  17. Quality Management of CERN Vacuum Controls

    CERN Document Server

    Antoniotti, F; Fortescue-Beck, E; Gama, J; Gomes, P; Le Roux, P; Pereira, H; Pigny, G

    2014-01-01

    The Vacuum Controls Section (TE-VSC-ICM) is in charge of the monitoring, maintenance and consolidation of the control systems of all accelerators and detectors at CERN; this represents 6 000 instruments distributed along 128 km of vacuum chambers, often of heterogeneous architectures and of diverse technical generations. In order to improve the efficiency of the services provided by ICM to vacuum experts and accelerator operators, a Quality Management Plan is being put into place. The first step was the standardization of the naming convention across different accelerators. The traceability of problems, requests, repairs, and other actions has also been put into place (VTL). This was combined with the effort to identify each individual device by a coded label and register it in a central database (MTF). Occurring in parallel was the gathering of old documents and the centralization of information concerning architectures, procedures, equipment and settings (EDMS). To describe the topology of control c...

  18. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement

  19. Adaptive statistical iterative reconstruction: reducing dose while preserving image quality in the pediatric head CT examination.

    Science.gov (United States)

    McKnight, Colin D; Watcharotone, Kuanwong; Ibrahim, Mohannad; Christodoulou, Emmanuel; Baer, Aaron H; Parmar, Hemant A

    2014-08-01

    Over the last decade there has been escalating concern regarding the increasing radiation exposure stemming from CT exams, particularly in children. Adaptive statistical iterative reconstruction (ASIR) is a relatively new and promising tool to reduce radiation dose while preserving image quality. While encouraging results have been found in adult head, chest and body imaging, validation of this technique in the pediatric population is limited. The objective of our study was to retrospectively compare the image quality and radiation dose of pediatric head CT examinations obtained with ASIR to pediatric head CT examinations without ASIR in a large patient population. Retrospective analysis was performed on 82 pediatric head CT examinations. This group included 33 pediatric head CT examinations obtained with ASIR and 49 pediatric head CT examinations without ASIR. Computed tomography dose index (CTDIvol) was recorded on all examinations. Quantitative analysis consisted of standardized measurement of attenuation and the standard deviation at the bilateral centrum semiovale and cerebellar white matter to evaluate objective noise. Qualitative analysis consisted of independent assessment by two radiologists in a blinded manner of gray-white differentiation, sharpness and overall diagnostic quality. The average CTDIvol value of the ASIR group was 21.8 mGy (SD = 4.0) while the average CTDIvol for the non-ASIR group was 29.7 mGy (SD = 13.8), reflecting a statistically significant reduction in CTDIvol in the ASIR group, with statistically significant reductions in CTDIvol for the 3- to 12-year-old ASIR group as compared to the 3- to 12-year-old non-ASIR group (21.5 mGy vs. 30.0 mGy; P = 0.004) as well as for the >12-year-old ASIR group as compared to the >12-year-old non-ASIR group (29.7 mGy vs. 49.9 mGy; P = 0.0002). Quantitative analysis revealed no significant difference in the homogeneity of variance in the ASIR group compared to the non-ASIR group. Radiologist assessment of

  20. Microbiological quality control practices at Australian Radioisotopes

    International Nuclear Information System (INIS)

    Saunders, M.

    1987-01-01

    As a domestic manufacturer of therapeutic substances, Australian Radioisotopes (ARI) must adhere to guidelines set out by the Commonwealth Department of Health in the Code of Good Manufacturing Practices for Therapeutic Goods 1983 (GMP). The GMP gives guidelines for staff training, building requirements, sanitation, documentation and quality control practices. These guidelines form the basis for regular audits performed by officers of the National Biological Standards Laboratories. At Lucas Heights, ARI has combined the principles of the GMP with the overriding precautions introduced for environmental and staff safety and protection. Its policy is to maintain a high level of quality assurance for product identity, purity, sterility and apyrogenicity during all stages of product manufacture.

  1. Basic quality control in diagnostic radiology

    International Nuclear Information System (INIS)

    Wikstrom, Erik

    2016-01-01

    Along the route toward regular performance of Quality Control in the Diagnostic Imaging sector there are a number of balances to negotiate: Patient/Staff safety considerations vs Regulatory compliance vs Performance of modern equipment vs Clinic's Productivity. At first glance these ambitions may seem in conflict. The tests performed to meet regulatory requirements may or may not bear any semblance to real clinical measurement scenarios. And the process of collecting the data from the quality assurance tests may induce a system downtime that adversely affects the clinic's overall productivity. Furthermore, the time it takes to complete the analysis of the test data and provide the report required to take the facility back into operation is time wasted for patients waiting for a diagnostic imaging exam

  2. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Thygesen, Jesper; Gerke, Oke; Egstrup, Kenneth; Waaler, Dag; Lambrechtsen, Jess

    2016-12-01

    Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, increased low contrast resolution for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using the subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. VGA showed significant improvements in sharpness by comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. ASIR improved the subjective image quality of parameter sharpness and, objectively, reduced noise and increased CNR.

  3. Statistical issues in reporting quality data: small samples and casemix variation.

    Science.gov (United States)

    Zaslavsky, A M

    2001-12-01

    To present two key statistical issues that arise in analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (inter-unit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With broader understanding of casemix adjustment and methods for analyzing small samples, quality data can be analysed and reported more accurately.
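    The 'shrinkage' estimation mentioned above can be sketched as an empirical-Bayes compromise between each unit's own small-sample mean and the grand mean, weighted by that unit's reliability. The variance components below are assumed known purely for illustration; in practice they are estimated from the data, and the numbers are invented.

```python
def shrink(unit_means, unit_ns, sigma2, tau2):
    """Shrink each unit mean toward the grand mean with weight
    w = tau2 / (tau2 + sigma2 / n), the unit's reliability.
    sigma2: within-unit variance; tau2: between-unit variance."""
    grand = sum(m * n for m, n in zip(unit_means, unit_ns)) / sum(unit_ns)
    estimates = []
    for m, n in zip(unit_means, unit_ns):
        w = tau2 / (tau2 + sigma2 / n)
        estimates.append(w * m + (1 - w) * grand)
    return estimates

# A large unit (n=400) keeps nearly its own mean; a small unit (n=10)
# is pulled substantially toward the grand mean.
ests = shrink([0.90, 0.60], [400, 10], sigma2=1.0, tau2=0.25)
```

    This is why shrinkage summarizes the strength of evidence: a unit with few observations is not allowed to claim an extreme rating on noisy data.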

  4. Statistical corruption in Beijing's air quality data has likely ended in 2012

    Science.gov (United States)

    Stoerk, Thomas

    2016-02-01

    This research documents changes in likely misreporting in official air quality data from Beijing for the years 2008-2013. It is shown that, consistent with prior research, the official Chinese data report suspiciously few observations that exceed the politically important Blue Sky Day threshold, a particular air pollution level used to evaluate local officials, and an excess of observations just below that threshold. Similar data, measured by the US Embassy in Beijing, do not show this irregularity. To document likely misreporting, this analysis proposes a new way of comparing air quality data via Benford's Law, a statistical regularity known to fit air pollution data. Using this method to compare the official data to the US Embassy data for the first time, I find that the Chinese data fit Benford's Law poorly until a change in air quality measurements at the end of 2012. From 2013 onwards, the Chinese data fit Benford's Law closely. The US Embassy data, by contrast, exhibit no variation over time in the fit with Benford's Law, implying that the underlying pollution processes remain unchanged. These findings suggest that misreporting of air quality data for Beijing has likely ended in 2012. Additionally, I use aerosol optical density data to show the general applicability of this method of detecting likely misreporting in air pollution data.
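    Benford's Law predicts that leading digit d occurs with probability log10(1 + 1/d). A minimal sketch of the kind of goodness-of-fit comparison described above, using a Pearson chi-square statistic on leading-digit counts (the paper's exact test statistic may differ):

```python
import math
from collections import Counter

# Benford probabilities for leading digits 1..9.
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Leading nonzero digit of a positive number in plain decimal notation."""
    return int(str(abs(x)).lstrip("0.")[0])

def benford_chi2(values):
    """Pearson chi-square of the leading-digit counts against Benford's
    Law; larger values indicate a worse fit."""
    digits = Counter(first_digit(v) for v in values if v > 0)
    n = sum(digits.values())
    return sum((digits.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in BENFORD.items())

# Powers of 2 are known to follow Benford's Law closely; a sample with
# uniformly distributed leading digits does not.
powers = [2 ** k for k in range(1, 400)]
uniform = [d * 10.0 for d in range(1, 10)] * 100
```

    Applied to air quality series, a persistently poor fit of the reported data, against a good fit of independently measured data, is the signature of misreporting the study looks for.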

  5. Image quality of multiplanar reconstruction of pulmonary CT scans using adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Honda, O; Yanagawa, M; Inoue, A; Kikuyama, A; Yoshida, S; Sumikawa, H; Tobino, K; Koyama, M; Tomiyama, N

    2011-04-01

    We investigated the image quality of multiplanar reconstruction (MPR) using adaptive statistical iterative reconstruction (ASIR). Inflated and fixed lungs were scanned with a garnet detector CT in high-resolution mode (HR mode) or non-high-resolution (non-HR) mode, and MPR images were then reconstructed. Observers compared 15 MPR images of ASIR (40%) and ASIR (80%) with those of ASIR (0%), and assessed image quality using a visual five-point scale (1, definitely inferior; 5, definitely superior), with particular emphasis on normal pulmonary structures, artefacts, noise and overall image quality. The mean overall image quality scores in HR mode were 3.67 with ASIR (40%) and 4.97 with ASIR (80%). Those in non-HR mode were 3.27 with ASIR (40%) and 3.90 with ASIR (80%). The mean artefact scores in HR mode were 3.13 with ASIR (40%) and 3.63 with ASIR (80%), but those in non-HR mode were 2.87 with ASIR (40%) and 2.53 with ASIR (80%). The mean scores of the other parameters were greater than 3, and those in HR mode were higher than those in non-HR mode. There were significant differences between ASIR (40%) and ASIR (80%) in overall image quality. ASIR did not suppress the severe artefacts of contrast medium. In general, MPR image quality with ASIR (80%) was superior to that with ASIR (40%). However, there was an increased incidence of artefacts with ASIR when CT images were obtained in non-HR mode.

  6. Statistical analysis of the influence of wheat black point kernels on selected indicators of wheat flour quality

    Directory of Open Access Journals (Sweden)

    Petrov Verica D.

    2011-01-01

    Full Text Available The influence of wheat black point kernels on selected indicators of wheat flour quality - farinograph and extensograph indicators, amylolytic activity, wet gluten and flour ash content - was examined in this study. The examinations were conducted on samples of wheat harvested in 2007 and 2008 in the area of Central Banat, in four treatments: a control (without black point flour) and treatments with 2%, 4% and 10% of black point flour added as a replacement for part of the control sample. Statistically significant differences between treatments were observed in dough stability, falling number and extensibility. The samples with 10% black point flour had the lowest dough stability and the highest amylolytic activity and extensibility. There was a trend of increasing 15-min drop and water absorption with an increasing share of black point flour. Extensograph area, resistance and the ratio of resistance to extensibility decreased with the addition of black point flour, though not consistently. Mahalanobis distance indicates that the addition of 10% black point flour had the greatest influence on the observed quality indicators, confirming that black point affects the technological quality of wheat, i.e. flour.

  7. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control.

    Science.gov (United States)

    Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka

    2017-04-09

    Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short-term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short-term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 are achieved over half-hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of the Semantic Sensor Web.
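    The sliding-window setup described above can be sketched as follows: each window of past readings becomes a feature vector whose target is the reading a fixed horizon ahead, the shape of input one would feed to a model such as the paper's Multilayer Perceptron. The window width, horizon and toy series are arbitrary examples, not the study's configuration.

```python
def sliding_windows(series, width, horizon):
    """Build (features, target) pairs from a univariate series: `width`
    past readings predict the reading `horizon` steps after the window."""
    pairs = []
    for i in range(len(series) - width - horizon + 1):
        pairs.append((series[i:i + width], series[i + width + horizon - 1]))
    return pairs

# Toy series standing in for periodic PM2.5 readings.
pairs = sliding_windows([1, 2, 3, 4, 5, 6], width=3, horizon=1)
```

    Each pair then trains or queries the predictive model; a predicted exceedance of a pollution threshold is what triggers the proactive warning.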

  8. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control

    Directory of Open Access Journals (Sweden)

    Jude Adekunle Adeleke

    2017-04-01

    Full Text Available Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short-term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short-term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half-hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web.

  9. ANALYSIS OF QUALITY COSTS FOR STATISTICAL QUALITY CONTROL PLANNING

    Directory of Open Access Journals (Sweden)

    N. Chiadamrong

    2017-12-01

    Full Text Available Quality has become one of the most important forces leading to organizational success and company growth in national and international markets. The return on investment from strong and effective quality programs is providing excellent profitability results in firms with effective quality strategies. Due to the wide variation in quality results, the search for the genuine keys to success in quality has become a matter of deep concern to the management of companies. This paper suggests a way of quantifying quality costs. As a result, appropriate quality strategies can be adjusted and set to match each company's situation, based on the suggested categorization of quality costs. This outcome can then be used as a guideline for manufacturers in setting up a suitable quality program, one that establishes the proper balance between costs and customer service.

  10. An empirical comparison of key statistical attributes among potential ICU quality indicators.

    Science.gov (United States)

    Brown, Sydney E S; Ratcliffe, Sarah J; Halpern, Scott D

    2014-08-01

    Good quality indicators should have face validity, relevance to patients, and be able to be measured reliably. Beyond these general requirements, good quality indicators should also have certain statistical properties, including sufficient variability to identify poor performers, relative insensitivity to severity adjustment, and the ability to capture what providers do rather than patients' characteristics. We assessed the performance of candidate indicators of ICU quality on these criteria. Indicators included ICU readmission, mortality, several length of stay outcomes, and the processes of venous-thromboembolism and stress ulcer prophylaxis provision. Retrospective cohort study. One hundred thirty-eight U.S. ICUs from 2001-2008 in the Project IMPACT database. Two hundred sixty-eight thousand eight hundred twenty-four patients discharged from U.S. ICUs. None. We assessed indicators' (1) variability across ICU-years; (2) degree of influence by patient vs. ICU and hospital characteristics using the Omega statistic; (3) sensitivity to severity adjustment by comparing the area under the receiver operating characteristic curve (AUC) between models including vs. excluding patient variables, and (4) correlation between risk adjusted quality indicators using a Spearman correlation. Large ranges of among-ICU variability were noted for all quality indicators, particularly for prolonged length of stay (4.7-71.3%) and the proportion of patients discharged home (30.6-82.0%), and ICU and hospital characteristics outweighed patient characteristics for stress ulcer prophylaxis (ω, 0.43; 95% CI, 0.34-0.54), venous thromboembolism prophylaxis (ω, 0.57; 95% CI, 0.53-0.61), and ICU readmissions (ω, 0.69; 95% CI, 0.52-0.90). Mortality measures were the most sensitive to severity adjustment (area under the receiver operating characteristic curve % difference, 29.6%); process measures were the least sensitive (area under the receiver operating characteristic curve % differences

  11. 40 CFR 75.21 - Quality assurance and quality control requirements.

    Science.gov (United States)

    2010-07-01

    ... quality assurance audit or any other audit, the system is out-of-control. The owner or operator shall... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Quality assurance and quality control... assurance and quality control requirements. (a) Continuous emission monitoring systems. The owner or...

  12. Distributed sensor architecture for intelligent control that supports quality of control and quality of service.

    Science.gov (United States)

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-02-25

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called the Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems.

  13. Distributed Sensor Architecture for Intelligent Control that Supports Quality of Control and Quality of Service

    Directory of Open Access Journals (Sweden)

    Jose-Luis Poza-Lujan

    2015-02-01

    Full Text Available This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called the Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems.

  14. Statistical Control Charts: Performances of Short Term Stock Trading in Croatia

    Directory of Open Access Journals (Sweden)

    Dumičić Ksenija

    2015-03-01

    Full Text Available Background: The stock exchange, as a regulated financial market, reflects the economic development level of modern economies. The stock market indicates the mood of investors in the development of a country and is an important ingredient for growth. Objectives: This paper aims to introduce an additional statistical tool to support the decision-making process in stock trading, and it investigates the use of statistical process control (SPC) methods in the stock trading process. Methods/Approach: The individual (I), exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) control charts were used for gaining trade signals. The open and the average prices of CROBEX10 index stocks on the Zagreb Stock Exchange were used in the analysis. The capabilities of statistical control charts for short-run stock trading were analysed. Results: The statistical control chart analysis pointed out too many signals to buy or sell stocks. Most of them are considered false alarms, so the statistical control charts proved not to be very useful in stock trading or in portfolio analysis. Conclusions: The presence of non-normality and autocorrelation has a great impact on the performance of statistical control charts. It is assumed that if these two problems were solved, the use of statistical control charts in portfolio analysis could be greatly improved.
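
To make the chart mechanics above concrete, the following sketch shows how an EWMA chart of the kind used in the study could generate trade signals on a price series. This is a generic illustration on synthetic prices, not the CROBEX10 data or the authors' exact chart settings; the function name and parameter values are assumptions.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0, n0=20):
    """EWMA control chart for a 1-D series x.

    lam : smoothing constant, L : limit width in sigma units,
    n0  : length of the Phase-I calibration window used to estimate
          the in-control mean and standard deviation.
    Returns (z, lcl, ucl); a point of z outside [lcl, ucl] is a signal.
    """
    x = np.asarray(x, dtype=float)
    mu, sigma = x[:n0].mean(), x[:n0].std(ddof=1)
    z = np.empty_like(x)
    z_prev = mu
    for i, xi in enumerate(x):
        z_prev = lam * xi + (1 - lam) * z_prev   # EWMA recursion
        z[i] = z_prev
    k = np.arange(1, len(x) + 1)
    # time-varying limits that widen toward the steady-state value
    half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
    return z, mu - half, mu + half

# Toy "price" series: stable around 100, then a shift to 103 at index 30
rng = np.random.default_rng(0)
prices = np.concatenate([np.full(30, 100.0), np.full(10, 103.0)]) \
         + rng.normal(0, 0.5, 40)
z, lcl, ucl = ewma_chart(prices)
signals = np.flatnonzero((z < lcl) | (z > ucl))
print(signals)
```

As the abstract notes, on real prices (non-normal, autocorrelated) such a chart tends to fire many false alarms; on this clean synthetic shift it signals shortly after index 30.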

  15. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    International Nuclear Information System (INIS)

    Xu, Shiyu; Chen, Ying; Lu, Jianping; Zhou, Otto

    2015-01-01

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications

  16. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu [Department of Electrical and Computer Engineering, Southern Illinois University Carbondale, Carbondale, Illinois 62901 (United States); Lu, Jianping; Zhou, Otto [Department of Physics and Astronomy and Curriculum in Applied Sciences and Engineering, University of North Carolina Chapel Hill, Chapel Hill, North Carolina 27599 (United States)

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
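
The penalized statistical reconstruction idea described in these two records can be sketched on a toy problem. The code below minimizes a penalized least-squares objective by gradient descent, with a simple first-difference penalty standing in for the paper's local voxel-pair prior; the system operator, data, and parameter values are synthetic and purely illustrative, not the authors' DBT model.

```python
import numpy as np

def pwls_reconstruct(A, y, beta=0.05, n_iter=2000):
    """Minimize ||A x - y||^2 + beta * ||D x||^2 by gradient descent,
    where D penalizes differences between neighboring voxels (a crude
    1-D stand-in for a local voxel-pair prior)."""
    m, n = A.shape
    D = (np.eye(n) - np.eye(n, k=1))[:-1]       # first-difference matrix
    H = A.T @ A + beta * (D.T @ D)              # Hessian (up to a factor of 2)
    step = 1.0 / np.linalg.eigvalsh(H).max()    # step size that guarantees convergence
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y) + beta * (D.T @ (D @ x))
        x -= step * grad
    return x

rng = np.random.default_rng(1)
x_true = np.array([0., 0., 1., 1., 1., 0., 0.])   # tiny piecewise-flat "object"
A = rng.normal(size=(20, 7))                      # toy "projection" operator
y = A @ x_true + rng.normal(0, 0.01, 20)          # noisy measurements
x_hat = pwls_reconstruct(A, y)
print(np.round(x_hat, 2))
```

Real systems replace the dense matrix with ray-driven forward/backprojection and use faster optimization-transfer updates, but the objective has the same shape.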

  17. Statistical Analysis of the Impacts of Regional Transportation on the Air Quality in Beijing

    Science.gov (United States)

    Huang, Zhongwen; Zhang, Huiling; Tong, Lei; Xiao, Hang

    2016-04-01

    From October to December 2015, the Beijing-Tianjin-Hebei (BTH) region experienced several severe haze events. In order to assess the effects of regional transportation on the air quality in Beijing, the air monitoring data (PM2.5, SO2, NO2 and CO) from that period published by the Chinese National Environmental Monitoring Center (CNEMC) were collected and analyzed with various statistical models. The cities within the BTH area were clustered into three groups according to geographical conditions, with the air pollutant concentrations of cities within a group sharing similar variation trends. The Granger causality test results indicate that significant causal relationships exist between the air pollutant data of Beijing and its surrounding cities (Baoding, Chengde, Tianjin and Zhangjiakou) for the reference period. Linear regression models were then constructed to capture the interdependency among the multiple time series. The observed air pollutant concentrations in Beijing were well consistent with the model-fitted results. More importantly, further analysis suggests that the air pollutants in Beijing were strongly affected by regional transportation, as local sources contributed only 17.88%, 27.12%, 14.63% and 31.36% of PM2.5, SO2, NO2 and CO concentrations, respectively, and the major foreign source for Beijing was from the southwest (Baoding) direction, accounting for more than 42% of all these air pollutants. Thus, by combining various statistical models, it may be possible not only to quickly predict the air quality of any city on a regional scale, but also to evaluate the local and regional source contributions for a particular city. Key words: regional transportation, air pollution, Granger causality test, statistical models
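
The Granger causality analysis mentioned above can be illustrated with a minimal numpy implementation. The sketch below computes the classical F statistic comparing a restricted autoregression of y against a model augmented with lags of x; it is a simplified stand-in for the full test (no stationarity checks, no p-value conversion), and the data are synthetic, not the CNEMC series.

```python
import numpy as np

def granger_f_stat(y, x, p=2):
    """F statistic for whether p lags of x improve an order-p
    autoregression of y (larger = more evidence that x 'Granger-causes' y)."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    X_r = np.hstack([ones, lags_y])             # restricted: y's own lags only
    X_f = np.hstack([ones, lags_y, lags_x])     # full: plus lags of x
    rss = lambda M: float(np.sum((Y - M @ np.linalg.lstsq(M, Y, rcond=None)[0]) ** 2))
    rss_r, rss_f = rss(X_r), rss(X_f)
    df_den = len(Y) - X_f.shape[1]
    return ((rss_r - rss_f) / p) / (rss_f / df_den)

# Synthetic example: x leads y by one step, so x should "Granger-cause" y
rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = np.zeros(200)
y[1:] = 0.8 * x[:-1] + 0.1 * rng.normal(size=199)
f_xy = granger_f_stat(y, x)   # evidence that x helps predict y: large
f_yx = granger_f_stat(x, y)   # evidence that y helps predict x: small
print(round(f_xy, 1), round(f_yx, 1))
```

In practice a library routine such as statsmodels' `grangercausalitytests` would be used, which also reports p-values for several test variants.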

  18. Adaptive statistical iterative reconstruction use for radiation dose reduction in pediatric lower-extremity CT: impact on diagnostic image quality.

    Science.gov (United States)

    Shah, Amisha; Rees, Mitchell; Kar, Erica; Bolton, Kimberly; Lee, Vincent; Panigrahy, Ashok

    2018-06-01

    For the past several years, increased levels of imaging radiation and cumulative radiation to children have been a significant concern. Although several measures have been taken to reduce radiation dose during computed tomography (CT) scans, the newer dose-reduction software adaptive statistical iterative reconstruction (ASIR) has been an effective technique for reducing radiation dose. To our knowledge, no published studies assess the effect of ASIR on extremity CT scans in children. To compare radiation dose, image noise, and subjective image quality in pediatric lower-extremity CT scans acquired with and without ASIR. The study group consisted of 53 patients imaged on a CT scanner equipped with ASIR software. The control group consisted of 37 patients whose CT images were acquired without ASIR. Image noise, the Computed Tomography Dose Index (CTDI) and the dose-length product (DLP) were measured. Two pediatric radiologists rated the studies in subjective categories: image sharpness, noise, diagnostic acceptability, and artifacts. The CTDI (p = 0.0184) and DLP were significantly lower in ASIR than in non-ASIR studies; however, the subjective ratings for sharpness and noise were worse for ASIR images than for non-ASIR CT studies. Adaptive statistical iterative reconstruction reduces radiation dose for lower-extremity CTs in children, but at the expense of diagnostic imaging quality. Further studies are warranted to determine the specific utility of ASIR for pediatric musculoskeletal CT imaging.
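
For reference, the two dose metrics compared in the study are directly related: the dose-length product is the volume CTDI multiplied by the irradiated scan length. A one-line sketch with hypothetical values (not the study's measurements):

```python
def dlp(ctdi_vol_mgy, scan_length_cm):
    """Dose-length product (mGy*cm) = volume CTDI (mGy) x scan length (cm)."""
    return ctdi_vol_mgy * scan_length_cm

# Illustrative only: a 3.2 mGy CTDIvol over a 25 cm lower-extremity scan
print(dlp(3.2, 25.0))  # 80.0 mGy*cm
```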

  19. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  20. Quality control concept for radioactive waste packages

    International Nuclear Information System (INIS)

    Warnecke, E.; Martens, B.R.; Odoj, R.

    1990-01-01

    In the Federal Republic of Germany, a contract with the BfS for the performance of quality control measures is necessary. Two alternative methods can in principle be applied: random checks on waste packages, or qualification of conditioning processes with subsequent inspections. Priority is given to control by process qualification. Both methods have been successfully developed in the Federal Republic of Germany and can be applied. In the course of the qualification of conditioning processes, it must be demonstrated by inactive and/or active runs that waste packages are produced which fulfil the waste acceptance requirements. The qualification results in the establishment of a handbook for the operation of the respective conditioning process, including the process instrumentation and the operational margins. The qualified process is then inspected to assure that actual operation complies with the conditions fixed in the handbook. (orig./DG)

  1. Software Quality Control at Belle II

    Science.gov (United States)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. To keep a coherent software stack of high quality, such that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  2. Random Forest Application for NEXRAD Radar Data Quality Control

    Science.gov (United States)

    Keem, M.; Seo, B. C.; Krajewski, W. F.

    2017-12-01

    Identification and elimination of non-meteorological radar echoes (e.g., returns from ground, wind turbines, and biological targets) are the basic data quality control steps before radar data use in quantitative applications (e.g., precipitation estimation). Although WSR-88Ds' recent upgrade to dual-polarization has enhanced this quality control and echo classification, there are still challenges in detecting some non-meteorological echoes that show precipitation-like characteristics (e.g., wind turbine or anomalous propagation clutter embedded in rain). With this in mind, a new quality control method using Random Forest is proposed in this study. This classification algorithm is known to produce reliable results with less uncertainty. The method introduces randomness into sampling and feature selection and integrates the resulting multiple decision trees. The multidimensional structure of the trees can characterize the statistical interactions of the involved features in complex situations. The authors explore the performance of the Random Forest method for NEXRAD radar data quality control. Training datasets are selected using several clear cases of precipitation and non-precipitation (but with some non-meteorological echoes). The model is structured using available candidate features (from the NEXRAD data) such as horizontal reflectivity, differential reflectivity, differential phase shift, copolar correlation coefficient, and their horizontal textures (e.g., local standard deviation). The influence of each feature on classification results is quantified by variable importance measures that are automatically estimated by the Random Forest algorithm. Therefore, the number and types of features in the final forest can be examined based on the classification accuracy. The authors demonstrate the capability of the proposed approach using several cases ranging from distinct to complex rain/no-rain events and compare the performance with the existing algorithms (e
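
A minimal sketch of this kind of classifier, using scikit-learn's `RandomForestClassifier` on synthetic stand-ins for dual-polarization features. The feature construction and values below are invented for illustration (not real NEXRAD data): precipitation-like gates get a high copolar correlation and low reflectivity texture, clutter the opposite, plus one deliberately uninformative feature to show how variable importance exposes it.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 400
# Synthetic copolar correlation: ~0.98 for precipitation, ~0.80 for clutter
rho_hv = np.where(rng.random(n) < 0.5,
                  rng.normal(0.98, 0.01, n),
                  rng.normal(0.80, 0.05, n))
labels = (rho_hv > 0.9).astype(int)          # 1 = precipitation, 0 = clutter
# Synthetic reflectivity texture: low for precipitation, high for clutter
texture = np.where(labels == 1, rng.normal(2, 0.5, n), rng.normal(8, 2, n))
noise = rng.normal(size=n)                   # an uninformative feature
X = np.column_stack([rho_hv, texture, noise])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
importances = dict(zip(["rho_hv", "texture", "noise"], clf.feature_importances_))
print(importances)   # noise should rank far below the physical features
```

The `feature_importances_` attribute is what the abstract refers to as automatically estimated variable importance measures; it lets the analyst prune features that do not help the classification.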

  3. Pengendalian Kualitas Kertas Dengan Menggunakan Statistical Process Control di Paper Machine 3

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2017-01-01

    Full Text Available The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method. Statistical process control (SPC) is a technique for solving problems and is used to monitor, control, analyze, manage and improve products and processes using statistical methods. Based on Pareto diagrams, the wavy defect is found to be the most frequent, accounting for 81.7% of defects. The human factor, meanwhile, is found to be the main cause of defects, primarily due to a lack of understanding of the machinery and a lack of training, both leading to errors in data input.
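
The Pareto analysis step can be reproduced in a few lines. The defect counts below are hypothetical, chosen only so that the leading "wavy" category matches the 81.7% share reported in the abstract; the other category names and tallies are invented for illustration.

```python
from collections import Counter

# Hypothetical defect tally for one production period (illustrative numbers)
defects = Counter({"wavy": 294, "hole": 31, "dirty": 18, "wrinkle": 11, "other": 6})

total = sum(defects.values())
cum = 0.0
pareto = []
for name, count in defects.most_common():     # sorted by frequency, descending
    cum += 100.0 * count / total
    pareto.append((name, count, round(cum, 1)))

for row in pareto:
    print(row)    # (category, count, cumulative %) - the Pareto table
```

Plotting the counts as bars with the cumulative percentage as a line gives the familiar Pareto diagram; the few categories that reach ~80% cumulatively are the ones to attack first.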

  4. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  5. Statistical applications for chemistry, manufacturing and controls (CMC) in the pharmaceutical industry

    CERN Document Server

    Burdick, Richard K; Pfahler, Lori B; Quiroz, Jorge; Sidor, Leslie; Vukovinsky, Kimberly; Zhang, Lanju

    2017-01-01

    This book examines statistical techniques that are critically important to Chemistry, Manufacturing, and Control (CMC) activities. Statistical methods are presented with a focus on applications unique to the CMC in the pharmaceutical industry. The target audience consists of statisticians and other scientists who are responsible for performing statistical analyses within a CMC environment. Basic statistical concepts are addressed in Chapter 2 followed by applications to specific topics related to development and manufacturing. The mathematical level assumes an elementary understanding of statistical methods. The ability to use Excel or statistical packages such as Minitab, JMP, SAS, or R will provide more value to the reader. The motivation for this book came from an American Association of Pharmaceutical Scientists (AAPS) short course on statistical methods applied to CMC applications presented by four of the authors. One of the course participants asked us for a good reference book, and the only book recomm...

  6. Quality control of nuclear medicine instruments

    International Nuclear Information System (INIS)

    1984-11-01

    This document, which gives detailed guidance on the quality control of the various electronic instruments used for radiation detection and measurement in nuclear medicine, stems from the work of two Advisory Groups convened by the International Atomic Energy Agency (IAEA). A preliminary document, including recommended test schedules but lacking actual protocols for the tests, was drawn up by the first of these groups, meeting at the IAEA Headquarters in Vienna in 1979. A revised and extended version, incorporating recommended test protocols, was prepared by the second Group, meeting likewise in Vienna in 1982. This version is the model for the present text. The document should be of value to all nuclear medicine units, and especially to those in developing countries, in the initiation or revision of schemes for the quality control of their instruments. Its recommendations have provided the basis for instruction in two IAEA regional technical co-operation projects in the subject field, one initiated in 1981 for countries of Latin America and one initiated in 1982 for countries of Asia and the Pacific

  7. Toward standardising gamma camera quality control procedures

    International Nuclear Information System (INIS)

    Alkhorayef, M.A.; Alnaaimi, M.A.; Alduaij, M.A.; Mohamed, M.O.; Ibahim, S.Y.; Alkandari, F.A.; Bradley, D.A.

    2015-01-01

    Attaining high standards of efficiency and reliability in the practice of nuclear medicine requires appropriate quality control (QC) programs. For instance, the regular evaluation and comparison of extrinsic and intrinsic flood-field uniformity enables the quick correction of many gamma camera problems. Whereas QC tests for uniformity are usually performed by exposing the gamma camera crystal to a uniform flux of gamma radiation from a source of known activity, such protocols can vary significantly. Thus, there is a need for optimization and standardization, in part to allow direct comparison between gamma cameras from different vendors. In the present study, intrinsic uniformity was examined as a function of source distance, source activity, source volume and number of counts. The extrinsic uniformity and spatial resolution were also examined. Proper standard QC procedures need to be implemented because of the continual development of nuclear medicine imaging technology and the rapid expansion and increasing complexity of hybrid imaging system data. The present work seeks to promote a set of standard testing procedures to contribute to the delivery of safe and effective nuclear medicine services. - Highlights: • Optimal parameters for quality control of the gamma camera are proposed. • For extrinsic and intrinsic uniformity a minimum of 15,000 counts is recommended. • For intrinsic flood uniformity the activity should not exceed 100 µCi (3.7 MBq). • For intrinsic uniformity the source to detector distance should be at least 60 cm. • The bar phantom measurement must be performed with at least 15 million counts.
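
The flood-field uniformity figures discussed above come from a standard calculation. The sketch below implements NEMA-style integral and differential uniformity on a simulated flood image; it is a simplified illustration that omits the nine-point smoothing and useful-field-of-view masking a real NEMA analysis requires, and the count level is arbitrary.

```python
import numpy as np

def integral_uniformity(flood):
    """Integral uniformity: 100*(max-min)/(max+min) over the flood image."""
    mx, mn = float(flood.max()), float(flood.min())
    return 100.0 * (mx - mn) / (mx + mn)

def differential_uniformity(flood, window=5):
    """Worst 100*(max-min)/(max+min) over sliding windows of `window`
    pixels along every row and column."""
    worst = 0.0
    for img in (flood, flood.T):              # rows first, then columns
        for line in img:
            for i in range(len(line) - window + 1):
                seg = line[i:i + window]
                mx, mn = float(seg.max()), float(seg.min())
                worst = max(worst, 100.0 * (mx - mn) / (mx + mn))
    return worst

# Simulated flood acquisition: Poisson counting noise, ~10,000 counts/pixel
rng = np.random.default_rng(4)
flood = rng.poisson(10000, size=(64, 64)).astype(float)
iu = integral_uniformity(flood)
du = differential_uniformity(flood)
print(round(iu, 2), round(du, 2))
```

Because every window is a subset of the full field, differential uniformity can never exceed integral uniformity; tracking both over time is what lets a QC program catch slowly developing detector problems.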

  8. Performance and quality control of scintillation cameras

    International Nuclear Information System (INIS)

    Moretti, J.L.; Iachetti, D.

    1983-01-01

    Acceptance testing and quality control assurance of gamma cameras are part of diagnostic quality in clinical practice. Several parameters are required to achieve good diagnostic reliability: intrinsic spatial resolution, spatial linearity, uniformities, energy resolution, count-rate characteristics, and multiple-window spatial analysis. Each parameter was measured and also estimated by a test easy to implement in routine practice. The material required was a 4028 multichannel analyzer linked to a microcomputer, mini-computers and a set of phantoms (parallel slits, diffusing phantom, orthogonal-hole transmission pattern). The gamma cameras under study were: CGR 3400, CGR 3420, G.E. 4000, Siemens ZLC 75 and large-field Philips. Several tests proposed by N.E.M.A. and W.H.O. need to be improved with regard to overly localized spatial determinations during distortion measurements with multiple windows. Contrast control of the image needs to be monitored at high counting rates. This study shows the need to avoid point-by-point determinations and the value of giving sets of values of the same parameter over the whole field, and of reporting mean values with their standard deviations [fr

  9. Quality control of radioiodinated gastrin for radioimmunoassay

    International Nuclear Information System (INIS)

    Ginabreda, M.G.P.; Borghi, V.C.; Bettarello, A.

    1988-07-01

    Radioiodinated human gastrin has been prepared at the IPEN laboratory for radioimmunoassay use. This work developed the quality control of this tracer, analyzing parameters of the labelling reaction, chromatographic purification and radioimmunoassay. The radioiodination yield obtained in five experiments was reproducible and similar when analyzed by 7% polyacrylamide gel electrophoresis (PAGE) (mean ± SD of 51.70 ± 10.76%) and by 125I incorporation checked through trichloroacetic acid precipitation (TCA) (57.36 ± 9.69%). Similarly, after purification the labelled gastrin revealed a high and reproducible degree of purity when submitted to PAGE (96.57 ± 1.06%) and TCA (94.82 ± 4.20%) analysis. The respective specific activities varied from 62 to 307 µCi/µg, determined by the self-displacement method, which is based on the immunoactivity of the tracer. The antibody titers required to bind 50% of the tracer ranged from 1:32,000 to 1:180,000. Consequently, the respective doses producing a 50% fall in the maximum response of the radioimmunoassays ranged from 155.0 to 24.0 pmol/l, but remained unchanged for each tracer even after three months from its preparation. The tracers presented very low non-specific binding values (1.78 ± 0.79%), stable specific binding values (46.49 ± 5.65%) and good between-assay precision, evaluated by an internal quality control sample (25.71 ± 4.30%, with a coefficient of variation of 16.74%). The PAGE analysis of the unlabelled gastrin used in the first and last radioiodination revealed a unique and unaltered component, confirming the quality of the tracers. (author) [pt

  10. Adaptive statistical iterative reconstruction: reducing dose while preserving image quality in the pediatric head CT examination

    International Nuclear Information System (INIS)

    McKnight, Colin D.; Watcharotone, Kuanwong; Ibrahim, Mohannad; Christodoulou, Emmanuel; Baer, Aaron H.; Parmar, Hemant A.

    2014-01-01

    Over the last decade there has been escalating concern regarding the increasing radiation exposure stemming from CT exams, particularly in children. Adaptive statistical iterative reconstruction (ASIR) is a relatively new and promising tool to reduce radiation dose while preserving image quality. While encouraging results have been found in adult head, chest and body imaging, validation of this technique in the pediatric population is limited. The objective of our study was to retrospectively compare the image quality and radiation dose of pediatric head CT examinations obtained with ASIR to pediatric head CT examinations without ASIR in a large patient population. Retrospective analysis was performed on 82 pediatric head CT examinations. This group included 33 pediatric head CT examinations obtained with ASIR and 49 pediatric head CT examinations without ASIR. The computed tomography dose index (CTDIvol) was recorded for all examinations. Quantitative analysis consisted of standardized measurement of attenuation and its standard deviation at the bilateral centrum semiovale and cerebellar white matter to evaluate objective noise. Qualitative analysis consisted of independent assessment by two radiologists, in a blinded manner, of gray-white differentiation, sharpness and overall diagnostic quality. The average CTDIvol value of the ASIR group was 21.8 mGy (SD = 4.0) while the average CTDIvol for the non-ASIR group was 29.7 mGy (SD = 13.8), reflecting a statistically significant reduction in CTDIvol in the ASIR group (P 12-year-old ASIR group as compared to the >12-year-old non-ASIR group (29.7 mGy vs. 49.9 mGy; P = 0.0002). Quantitative analysis revealed no significant difference in the homogeneity of variance in the ASIR group compared to the non-ASIR group. Radiologist assessment of gray-white differentiation, sharpness and overall diagnostic quality in ASIR examinations was not substantially different compared to non-ASIR examinations. The use of ASIR in

  11. EVALUATION OF PHOTOGRAMMETRIC BLOCK ORIENTATION USING QUALITY DESCRIPTORS FROM STATISTICALLY FILTERED TIE POINTS

    Directory of Open Access Journals (Sweden)

    A. Calantropio

    2018-05-01

    Full Text Available Due to the increasing number of low-cost sensors widely accessible on the market, and because the semi-automatic 3D reconstruction workflow implemented in recent commercial software is often assumed to be correct, more and more users now operate without following the rigour of classical photogrammetric methods. This behaviour often naively leads to 3D products that lack metric quality assessment. This paper proposes and analyses an approach that gives users the possibility to preserve the trustworthiness of the metric information inherent in the 3D model without sacrificing the automation offered by modern photogrammetry software. First, the importance of data quality assessment is outlined, together with a recall of photogrammetry best practices. With the purpose of guiding the user through a correct pipeline for a certified 3D model reconstruction, an operative workflow is proposed, focusing on the first part of the object reconstruction steps (tie-point extraction, camera calibration and relative orientation). A new GUI (Graphical User Interface) developed for the open-source MicMac suite is then presented, and a sample dataset is used for the evaluation of the photogrammetric block orientation using statistically obtained quality descriptors. The results and future directions are then presented and discussed.

  12. Application of Multivariate Statistical Analysis in Evaluation of Surface River Water Quality of a Tropical River

    Directory of Open Access Journals (Sweden)

    Teck-Yee Ling

    2017-01-01

    Full Text Available The present study evaluated the spatial variations of surface water quality in a tropical river using multivariate statistical techniques, including cluster analysis (CA) and principal component analysis (PCA). Twenty physicochemical parameters were measured at 30 stations along the Batang Baram and its tributaries. The water quality of the Batang Baram was categorized as “slightly polluted”, with chemical oxygen demand and total suspended solids being the most deteriorated parameters. The CA grouped the 30 stations into four clusters which shared similar characteristics within the same cluster, representing the upstream, middle and downstream regions of the main river and the tributaries from the middle to downstream regions of the river. The PCA determined a reduced set of six principal components that explained 83.6% of the data set variance. The first PC indicated that total suspended solids, turbidity and hydrogen sulphide were the dominant polluting factors, attributed to logging activities, followed in the second PC by the five-day biochemical oxygen demand, total phosphorus, organic nitrogen and nitrate-nitrogen, which are related to discharges from domestic wastewater. The components also imply that logging activities are the major anthropogenic activities responsible for water quality variations in the Batang Baram when compared to domestic wastewater discharge.
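    The PCA step described above can be sketched with plain NumPy. The data matrix below is random and purely illustrative (the study's real matrix is 30 stations × 20 physicochemical parameters):

```python
import numpy as np

# Hypothetical water-quality matrix: rows = sampling stations, columns =
# physicochemical parameters. Values are random, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))

# Standardize each parameter (PCA on the correlation matrix, the usual
# choice when parameters have mixed units such as mg/L, NTU, degrees C).
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: singular values give the component variances.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
var = s**2 / len(Z)
ratio = var / var.sum()          # proportion of variance per component
cum = np.cumsum(ratio)           # cumulative proportion

# How many PCs are needed to explain, say, 80% of the variance? (The
# study retained six PCs explaining 83.6% of its real data's variance.)
print(int(np.searchsorted(cum, 0.80)) + 1)
```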

  13. A bibliometric analysis of 50 years of worldwide research on statistical process control

    Directory of Open Access Journals (Sweden)

    Fabiane Letícia Lizarelli

    Full Text Available Abstract An increasing number of papers on statistical process control (SPC) have emerged in the last fifty years, especially in the last fifteen. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC has a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. There are several other statistical techniques and tools for assisting the continuous improvement and innovation of products and processes, but despite the limitations in their use in improvement projects, there is growing interest in the use of SPC. A gap between the SPC techniques taught in engineering courses and their practical application to industrial problems is observed in empirical research; thus, it is important to understand what has been done and to identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers and their networks. The main sources of publications are also identified; the publications of SPC papers are concentrated in a few international research journals, not necessarily those with the highest impact factors. Furthermore, the papers are focused on the industrial engineering, operations research and management science fields. The most common term found in the papers was cumulative sum control charts, but new topics have emerged and have been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods.

  14. 7 CFR 58.141 - Alternate quality control program.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Alternate quality control program. 58.141 Section 58... Service 1 Quality Specifications for Raw Milk § 58.141 Alternate quality control program. When a plant has in operation an acceptable quality program, at the producer level, which is approved by the...

  15. Technology requirement for Halal quality control | Husny | Journal of ...

    African Journals Online (AJOL)

    Technology requirement for Halal quality control. ... Findings show that each industry segments have different technology characteristics preference. ... Keywords: halal industry, quality control; technology assistance; food and beverage; ...

  16. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Science.gov (United States)

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after a stabilized process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we used five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
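    Evaluating times as individual processes, as in step three, is typically done with an individuals (I) chart whose limits come from the average moving range. A minimal sketch with invented operative times (the 2.66 constant is the standard 3/d2 factor for moving ranges of two observations):

```python
import statistics

def individuals_limits(times):
    """Phase-I individuals (I) chart limits from the average moving range:
    center ± 2.66 * MR-bar, where 2.66 = 3 / d2 and d2 = 1.128 for n = 2."""
    mr = [abs(b - a) for a, b in zip(times, times[1:])]
    mr_bar = statistics.mean(mr)
    center = statistics.mean(times)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Illustrative operative times (minutes) for one surgeon; the 95-minute
# case is an out-of-control point that would be excluded for stability.
times = [62, 58, 65, 60, 59, 63, 61, 95, 60, 62]
lcl, center, ucl = individuals_limits(times)
outliers = [t for t in times if not lcl <= t <= ucl]
print(outliers)  # [95]
```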

  17. SU-E-T-205: MLC Predictive Maintenance Using Statistical Process Control Analysis.

    Science.gov (United States)

    Able, C; Hampton, C; Baydush, A; Bright, M

    2012-06-01

    MLC failure increases accelerator downtime and negatively affects the clinic treatment delivery schedule. This study investigates the use of Statistical Process Control (SPC), a modern quality control methodology, to retrospectively evaluate MLC performance data, thereby predicting the impending failure of individual MLC leaves. SPC, a methodology which detects exceptional variability in a process, was used to analyze MLC leaf velocity data. An MLC velocity test is performed weekly on all leaves during morning QA. The leaves sweep 15 cm across the radiation field with the gantry pointing down. The leaf speed is analyzed from the generated dynalog file using quality assurance software. MLC leaf speeds in which a known motor failure occurred (8) and those in which no motor replacement was performed (11) were retrospectively evaluated over a 71-week period. SPC individual and moving range (I/MR) charts were used in the analysis. The I/MR chart limits were calculated using the first twenty weeks of data and set at 3 standard deviations from the mean. The MLCs in which a motor failure occurred followed two general trends: (a) no data indicating a change in leaf speed prior to failure (5 of 8) and (b) a series of data points exceeding the limit prior to motor failure (3 of 8). I/MR charts for a high percentage (8 of 11) of the non-replaced MLC motors indicated that only a single point exceeded the limit. These single-point excesses were deemed false positives. SPC analysis using MLC performance data may be helpful in detecting a significant percentage of impending failures of MLC motors. The ability to detect MLC failure may depend on the mode of failure (i.e. gradual or catastrophic). Further study is needed to determine whether increasing the sampling frequency could improve reliability. This project was supported by a grant from Varian Medical Systems, Inc. © 2012 American Association of Physicists in Medicine.
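    The charting scheme the abstract describes, with limits set at 3 standard deviations from the first twenty weeks of data, can be sketched as follows. The leaf speeds are invented for illustration, and this is not the quality assurance software the authors used:

```python
import statistics

def baseline_flags(speeds, baseline_weeks=20):
    """Flag weekly leaf-speed observations outside mean ± 3 SD limits
    computed from the first `baseline_weeks` observations (phase-I style,
    as described in the abstract). Returns indices of out-of-limit weeks."""
    base = speeds[:baseline_weeks]
    mu = statistics.mean(base)
    sigma = statistics.stdev(base)
    lo, hi = mu - 3 * sigma, mu + 3 * sigma
    return [i for i, v in enumerate(speeds) if not lo <= v <= hi]

# Illustrative leaf speeds (cm/s): a stable baseline, then a gradual
# slow-down of the kind that might precede a motor failure.
speeds = [2.50 + 0.01 * ((-1) ** i) for i in range(20)] + [2.50, 2.49, 2.30, 2.10]
print(baseline_flags(speeds))  # [22, 23]
```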

  18. Improving Quality in Teaching Statistics Concepts Using Modern Visualization: The Design and Use of the Flash Application on Pocket PCs

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Pei-Yu

    2009-01-01

    The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning which has improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher level cognitive abilities such as reasoning, interpretation, and…

  19. Managing Air Quality - Control Strategies to Achieve Air Pollution Reduction

    Science.gov (United States)

    Considerations in designing an effective control strategy related to air quality, controlling pollution sources, need for regional or national controls, steps to developing a control strategy, and additional EPA resources.

  20. 10 CFR 26.137 - Quality assurance and quality control.

    Science.gov (United States)

    2010-01-01

    ... cutoff concentration for the compound of interest, a control without the compound of interest (i.e., a certified negative control), and a control with at least one of the compounds of interest at a measurable... calibrator, a control without the compound of interest (i.e., a certified negative control), and a control...

  1. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography

    Directory of Open Access Journals (Sweden)

    Helle Precht

    2016-12-01

    Full Text Available Background Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution and increased low-contrast resolution for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. Purpose To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Material and Methods Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed-effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. Results VGA showed significant improvements in sharpness when comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. Conclusion ASIR improved the subjective image quality parameter sharpness and, objectively, reduced noise and increased CNR.
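    The objective CNR measure is conventionally computed from two regions of interest. A minimal sketch with invented Hounsfield-unit samples (the study's exact ROI definitions and noise estimator may differ):

```python
import numpy as np

def cnr(roi_signal, roi_background):
    """Contrast-to-noise ratio from two regions of interest:
    |mean(signal) - mean(background)| / SD(background)."""
    roi_signal = np.asarray(roi_signal, dtype=float)
    roi_background = np.asarray(roi_background, dtype=float)
    contrast = abs(roi_signal.mean() - roi_background.mean())
    noise = roi_background.std(ddof=1)  # sample SD as the noise estimate
    return contrast / noise

# Illustrative HU samples: contrast-filled vessel vs. adjacent tissue.
vessel = np.array([410.0, 405.0, 395.0, 390.0])
tissue = np.array([52.0, 48.0, 55.0, 45.0])
print(round(float(cnr(vessel, tissue)), 1))
```

Lower noise at the same contrast raises the CNR, which is why the 0.82 noise ratio and 1.26 CNR ratio reported for 60% ASIR move in opposite directions.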

  2. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  3. 14 CFR 21.147 - Changes in quality control system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Changes in quality control system. 21.147 Section 21.147 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... quality control system. After the issue of a production certificate, each change to the quality control...

  4. 14 CFR 145.211 - Quality control system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Quality control system. 145.211 Section 145...) SCHOOLS AND OTHER CERTIFICATED AGENCIES REPAIR STATIONS Operating Rules § 145.211 Quality control system. (a) A certificated repair station must establish and maintain a quality control system acceptable to...

  5. 7 CFR 58.523 - Laboratory and quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  6. 18 CFR 12.40 - Quality control programs.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Quality control... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a... meeting any requirements or standards set by the Regional Engineer. If a quality control program is...

  7. 21 CFR 640.56 - Quality control test for potency.

    Science.gov (United States)

    2010-04-01

    ... quality control test for potency may be performed by a clinical laboratory which meets the standards of... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Quality control test for potency. 640.56 Section...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Cryoprecipitate § 640.56 Quality control...

  8. Statistical methods for the evaluation of educational services and quality of products

    CERN Document Server

    Bini, Matilde; Piccolo, Domenico; Salmaso, Luigi

    2009-01-01

    The book presents statistical methods and models that can usefully support the evaluation of educational services and the quality of products. The evaluation of educational services, as well as the analysis of judgments and preferences, poses severe methodological challenges because of the following aspects: the observational nature of the context, which is associated with the problems of selection bias and the presence of nuisance factors; the hierarchical structure of the data (multilevel analysis); the multivariate and qualitative nature of the dependent variable; the presence of non-observable factors, e.g. satisfaction, calling for the use of latent variable models; and the simultaneous presence of components of pleasure and components of uncertainty in the explication of the judgments, which calls for the specification and estimation of mixture models. The contributions concern methodological advances developed mostly with reference to specific problems of evaluation using real data sets.

  9. Quality control of ATLAS muon chambers

    CERN Document Server

    Fabich, Adrian

    ATLAS is a general-purpose experiment for the future Large Hadron Collider (LHC) at CERN. Its Muon Spectrometer will require ∼5500 m² of precision tracking chambers to measure muon tracks along a spectrometer arm of 5 m to 15 m length, embedded in a magnetic field of ∼0.5 T. The precision tracking devices in the Muon System will be high-pressure drift tubes (MDTs). Approximately 370,000 MDTs will be assembled into ∼1200 drift chambers. The performance of the MDT chambers depends strongly on the mechanical quality of the chambers. The uniformity and stability of the performance can only be assured by providing very high quality control during production. Gas tightness, high-voltage behaviour and dark currents are global parameters common to gas detectors. For all chambers, they will be tested immediately after chamber assembly at every production site. Functional tests, for example radioactive source scans and cosmic-ray runs, will be performed in order to establish detailed performan...

  10. Acceptance, commissioning and quality control in radiosurgery

    International Nuclear Information System (INIS)

    Toreti, Dalila Luzia

    2009-01-01

    Stereotactic radiosurgery is a treatment technique that uses narrow beams of radiation focused with great accuracy on a small lesion. The introduction of micro multileaf collimators (mMLC) allows this technique to reach a higher degree of dose conformation to the target lesion, allowing smaller irradiation of critical structures and normal tissues. This paper presents the results of the acceptance tests and commissioning of a Varian 6EX linear accelerator dedicated to radiosurgery, associated with the BrainLab micro multileaf collimator installed in the Hospital das Clinicas da Faculdade de Medicina da USP (HC-FMUSP), and establishes a feasible quality assurance program for the services that employ this special technique. The results of the acceptance tests were satisfactory and in agreement with the specifications provided by the manufacturer, and the commissioning tests were within the international recommendations. The tests and measures that are part of the quality control process should be specific to each treatment unit, and the need, frequency and levels of tolerance

  11. Quality control in nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Abdelhalim, A.S.; Elsayed, A.A.; Shaaban, H.I.

    1988-01-01

    The Department of Metallurgy, NRC Inchass, is embarking on a programme in which, on a laboratory scale, fuel pins containing uranium dioxide pellets are going to be produced. The department is making use of the expertise and equipment at present available and is going to utilize the new fuel pin fabrication unit, which will shortly be in operation, for the fabrication and testing of uranium dioxide pellets, and then gradually adapt these techniques and develop national know-how in this field. This would also involve building up indigenous experience through proper training of qualified personnel. The tests that are applied to ensure the quality of UO2 pellets, the techniques implemented, the equipment used and the specifications of the equipment presently available are discussed. The following parameters are subject to quality control tests: density, O/U ratio, hydrogen content and microstructure; each property will be discussed. Measurements related to UO2 powders, including flowability, bulk density, O/U ratio, BET surface area and water content, will be critically discussed. Relevant tests to ensure QC of pellets are reviewed; these include surface integrity, density, dimensions and microstructure. 4 fig., 1 tab

  12. Tools for quality control of fingerprint databases

    Science.gov (United States)

    Swann, B. Scott; Libert, John M.; Lepley, Margaret A.

    2010-04-01

    Integrity of fingerprint data is essential to biometric and forensic applications. Accordingly, the FBI's Criminal Justice Information Services (CJIS) Division has sponsored development of software tools to facilitate quality control functions relative to maintaining its fingerprint data assets inherent to the Integrated Automated Fingerprint Identification System (IAFIS) and Next Generation Identification (NGI). This paper provides an introduction to two such tools. The first FBI-sponsored tool was developed by the National Institute of Standards and Technology (NIST) and examines and detects the spectral signature of the ridge-flow structure characteristic of friction ridge skin. The Spectral Image Validation/Verification (SIVV) utility differentiates fingerprints from non-fingerprints, including blank frames or segmentation failures erroneously included in data; provides a "first look" at image quality; and can identify anomalies in the sample rates of scanned images. The SIVV utility might detect errors in individual 10-print fingerprints inaccurately segmented from the flat, multi-finger image acquired by one of the automated collection systems increasing in availability and usage. In such cases, the lost fingerprint can be recovered by re-segmentation from the now compressed multi-finger image record. The second FBI-sponsored tool, CropCoeff, was developed by MITRE and thoroughly tested by NIST. CropCoeff enables cropping of the replacement single print directly from the compressed data file, thus avoiding decompression and recompression of images that might degrade fingerprint features necessary for matching.
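    The SIVV idea, that ridge flow leaves a characteristic ring in the 2D power spectrum, can be illustrated with a crude annulus-energy measure. The band limits, function name and test images below are invented; this is not the actual SIVV algorithm:

```python
import numpy as np

def ridge_band_energy(img, f_lo=0.05, f_hi=0.25):
    """Fraction of spectral power in a mid-frequency annulus -- a crude
    stand-in for detecting the ring that periodic ridge structure
    produces in the 2D power spectrum. Band limits are in cycles/pixel
    and are illustrative only."""
    img = np.asarray(img, dtype=float)
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    r = np.hypot(fy, fx)  # radial frequency of each spectral bin
    return spec[(r >= f_lo) & (r < f_hi)].sum() / spec.sum()

# A synthetic "ridge" pattern (sinusoidal stripes at 0.1 cycles/pixel)
# concentrates power in the annulus; white noise spreads it everywhere.
y, x = np.mgrid[0:64, 0:64]
ridges = np.sin(2 * np.pi * 0.1 * x)
noise = np.random.default_rng(1).normal(size=(64, 64))
print(ridge_band_energy(ridges) > ridge_band_energy(noise))  # True
```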

  13. A social network's changing statistical properties and the quality of human innovation

    Energy Technology Data Exchange (ETDEWEB)

    Uzzi, Brian [Kellogg School of Management, Northwestern University, Evanston, IL (United States)], E-mail: uzzi@northwestern.edu

    2008-06-06

    We examined the entire network of creative artists that made Broadway musicals, in the post-War period, a collaboration network of international acclaim and influence, with an eye to investigating how the network's structural features condition the relationship between individual artistic talent and the success of their musicals. Our findings show that some of the evolving topographical qualities of degree distributions, path lengths and assortativity are relatively stable with time even as collaboration patterns shift, which suggests their changes are only minimally associated with the ebb and flux of the success of new productions. In contrast, the clustering coefficient changed substantially over time and we found that it had a nonlinear association with the production of financially and artistically successful shows. When the clustering coefficient ratio is low or high, the financial and artistic success of the industry is low, while an intermediate level of clustering is associated with successful shows. We supported these findings with sociological theory on the relationship between social structure and collaboration and with tests of statistical inference. Our discussion focuses on connecting the statistical properties of social networks to their performance and the performance of the actors embedded within them.
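    The clustering coefficient central to these findings is straightforward to compute on a small collaboration graph. A sketch (the toy graph is purely illustrative):

```python
def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph given
    as an adjacency dict {node: set(neighbours)}: for each node, the
    fraction of its neighbour pairs that are themselves connected."""
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)  # convention: undefined -> 0
            continue
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# Toy collaboration graph: a closed triangle (artists A, B, C all worked
# together) plus a pendant artist D attached only to C.
graph = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}
print(clustering_coefficient(graph))  # (1 + 1 + 1/3 + 0) / 4 ≈ 0.583
```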

  14. Classification of Underlying Causes of Power Quality Disturbances: Deterministic versus Statistical Methods

    Directory of Open Access Journals (Sweden)

    Emmanouil Styvaktakis

    2007-01-01

    Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, giving an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two important issues to guarantee the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation of a sequence of data recordings is preprocessing to partition the data into segments, each representing a duration containing either an event or a transition between two events. Extraction of features is applied to each segment individually. Some useful features and their effectiveness are then discussed. Some experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
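    The segmentation step, partitioning a recording into event segments and transitions, can be illustrated with a windowed-RMS change detector. The window length, threshold and synthetic voltage dip below are invented for illustration; the paper's actual segmentation methods are more sophisticated:

```python
import math

def segment_rms(samples, window, threshold):
    """Compute per-window RMS of a waveform and mark the window indices
    where RMS changes by more than `threshold` (relative) between
    consecutive windows -- candidate transitions between events."""
    rms = []
    for start in range(0, len(samples) - window + 1, window):
        block = samples[start:start + window]
        rms.append(math.sqrt(sum(v * v for v in block) / window))
    changes = [i for i in range(1, len(rms))
               if abs(rms[i] - rms[i - 1]) / rms[i - 1] > threshold]
    return rms, changes

# Synthetic 1 p.u. sine (64 samples/cycle) that dips to 0.6 p.u. mid-record.
wave = [math.sin(2 * math.pi * 50 * t / 3200) for t in range(3200)]
dipped = [v * (0.6 if 1280 <= i < 2240 else 1.0) for i, v in enumerate(wave)]
rms, changes = segment_rms(dipped, window=64, threshold=0.1)
print(changes)  # [20, 35] -- the windows where the dip starts and ends
```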

  15. A social network's changing statistical properties and the quality of human innovation

    International Nuclear Information System (INIS)

    Uzzi, Brian

    2008-01-01

    We examined the entire network of creative artists that made Broadway musicals, in the post-War period, a collaboration network of international acclaim and influence, with an eye to investigating how the network's structural features condition the relationship between individual artistic talent and the success of their musicals. Our findings show that some of the evolving topographical qualities of degree distributions, path lengths and assortativity are relatively stable with time even as collaboration patterns shift, which suggests their changes are only minimally associated with the ebb and flux of the success of new productions. In contrast, the clustering coefficient changed substantially over time and we found that it had a nonlinear association with the production of financially and artistically successful shows. When the clustering coefficient ratio is low or high, the financial and artistic success of the industry is low, while an intermediate level of clustering is associated with successful shows. We supported these findings with sociological theory on the relationship between social structure and collaboration and with tests of statistical inference. Our discussion focuses on connecting the statistical properties of social networks to their performance and the performance of the actors embedded within them

  16. A social network's changing statistical properties and the quality of human innovation

    Science.gov (United States)

    Uzzi, Brian

    2008-06-01

    We examined the entire network of creative artists that made Broadway musicals, in the post-War period, a collaboration network of international acclaim and influence, with an eye to investigating how the network's structural features condition the relationship between individual artistic talent and the success of their musicals. Our findings show that some of the evolving topographical qualities of degree distributions, path lengths and assortativity are relatively stable with time even as collaboration patterns shift, which suggests their changes are only minimally associated with the ebb and flux of the success of new productions. In contrast, the clustering coefficient changed substantially over time and we found that it had a nonlinear association with the production of financially and artistically successful shows. When the clustering coefficient ratio is low or high, the financial and artistic success of the industry is low, while an intermediate level of clustering is associated with successful shows. We supported these findings with sociological theory on the relationship between social structure and collaboration and with tests of statistical inference. Our discussion focuses on connecting the statistical properties of social networks to their performance and the performance of the actors embedded within them.

  17. Quality control tests for conventional mammography

    International Nuclear Information System (INIS)

    Dawod, Alnazer Ahmed Ibrahim

    2014-12-01

    Mammography is the test that allows the radiologist to look at images of the inside of the breasts. Mammograms help detect breast cancer early, and successful treatment of breast cancer depends on that early diagnosis. Breast cancer is a very common condition: about one in every nine women develops breast cancer by the age of eighty. In addition to clinical examination and self-examination, mammography plays an important role in the detection of breast cancers before they become clinically visible tumors. Mammography is the most common test for early detection of breast cancer. The quality control techniques performed confirmed the importance of this programme for producing images with good diagnostic value, helping the radiologist to diagnose breast disease easily and avoiding exposing the patient to radiation hazards. (Author)

  18. Analytical quality control [An IAEA service]

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-07-01

    In analytical chemistry the determination of small or trace amounts of elements or compounds in different types of materials is increasingly important. The results of these findings have a great influence on different fields of science, and on human life. Their reliability, precision and accuracy must, therefore, be checked by analytical quality control measures. The International Atomic Energy Agency (IAEA) set up an Analytical Quality Control Service (AQCS) in 1962 to assist laboratories in Member States in the assessment of their reliability in radionuclide analysis, and in other branches of applied analysis in which radionuclides may be used as analytical implements. For practical reasons, most analytical laboratories are not in a position to check accuracy internally, as frequently resources are available for only one method; standardized sample material, particularly in the case of trace analysis, is not available and can be prepared by the institutes themselves only in exceptional cases; intercomparisons are organized rather seldom and many important types of analysis are so far not covered. AQCS assistance is provided by the shipment to laboratories of standard reference materials containing known quantities of different trace elements or radionuclides, as well as by the organization of analytical intercomparisons in which the participating laboratories are provided with aliquots of homogenized material of unknown composition for analysis. In the latter case the laboratories report their data to the Agency's laboratory, which calculates averages and distributions of results and advises each laboratory of its performance relative to all the others. Throughout the years several dozens of intercomparisons have been organized and many thousands of samples provided. The service offered, as a consequence, has grown enormously. The programme for 1973 and 1974, which is currently being distributed to Member States, will contain 31 different types of materials.

  19. The growing need for analytical quality control

    International Nuclear Information System (INIS)

    Suschny, O.; Richman, D.M.

    1974-01-01

    Technological development in a country is directly dependent upon its analytical chemistry or measurement capability, because it is impossible to achieve any level of technological sophistication without the ability to measure. Measurement capability is needed to determine both technological competence and technological consequence. But measurement itself is insufficient. There must be a standard or a reference for comparison. In the complicated world of chemistry the need for reference materials grows with successful technological development. The International Atomic Energy Agency has been distributing calibrated radioisotope solutions, standard reference materials and intercomparison materials since the early 1960's. The purpose of this activity has been to help laboratories in its Member States to assess and, if necessary, to improve the reliability of their analytical work. The value and continued need of this service has been demonstrated by the results of many intercomparisons which proved that without continuing analytical quality control activities, adequate reliability of analytical data could not be taken for granted. Analytical chemistry, lacking the glamour of other aspects of the physical sciences, has not attracted the attention it deserves, but in terms of practical importance, it warrants high priority in any developing technological scheme, because without it there is little chance to evaluate technological success or failure or opportunity to identify the reasons for success or failure. The scope and the size of the future programme of the IAEA in this field has been delineated by recommendations made by several Panels of Experts; all have agreed on the importance of this programme and made detailed recommendations in their areas of expertise. The Agency's resources are limited and it cannot on its own undertake the preparation and distribution of all the materials needed. It can, however, offer a focal point to bring together different

  20. Terms and definitions of quality assurance/quality control

    International Nuclear Information System (INIS)

    Kaden, W.

    1980-01-01

    Terms of quality assurance are defined and interpreted. Reference is made to the IAEA Code of Practice and to other important Codes and Standards like ANSI, ASME and KTA. The relevance of these terms to the everyday work and problems of a quality assurance engineer is explained. (orig.)

  1. [Compatibility of different quality control systems].

    Science.gov (United States)

    Invernizzi, Enrico

    2002-01-01

    Management of the good laboratory practice (GLP) quality system presupposes its linking to a basic recognized and approved quality system, from which it can draw on management procedures common to all quality systems, such as the ISO 9000 set of norms. A quality system organized in this way can also be integrated with other dedicated quality systems, or parts of them, to obtain principles or management procedures for specific topics. The aim of this organization is to set up a reliable, recognized quality system compatible with the principles of GLP and other quality management systems, which provides users with a simplified set of easily accessible management tools and answers. The organization of this quality system is set out in the quality assurance programme, which is actually the document in which the test facility incorporates the GLP principles into its own quality organization.

  2. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Qiandao Lake (Xin'an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April-July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
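    The "percentage of total variance explained" figures reported by PCA studies like this one come from the eigenvalues of the covariance (or correlation) matrix of the standardized parameters. The sketch below uses synthetic data with a deliberately rank-2 signal, not the Qiandao Lake measurements; sample size, mixing matrix and noise level are all illustrative assumptions.

```python
# Sketch: how PCA "% of total variance explained" is computed.
# Synthetic data (NOT the study's measurements): 50 samples of 4
# water-quality parameters driven by two latent pollution factors.
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(50, 2))            # two hypothetical pollution sources
mixing = np.array([[1.0, 0.8, 0.0,  0.3],
                   [0.0, 0.5, 1.0, -0.7]])   # how sources load onto parameters
X = latent @ mixing + 0.05 * rng.normal(size=(50, 4))
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize before PCA, as usual

cov = np.cov(X, rowvar=False)                # 4x4 covariance of standardized data
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
explained = eigvals / eigvals.sum()          # fraction of total variance per PC
print([round(float(p), 3) for p in explained])
```

    Because the synthetic signal is rank-2, the first two components absorb nearly all the variance, mirroring how the study's first few components explain ~80% of the total.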

  3. Methodological and Statistical Quality in Research Evaluating Nutritional Attitudes in Sports.

    Science.gov (United States)

    Kouvelioti, Rozalia; Vagenas, George

    2015-12-01

    The assessment of dietary attitudes and behaviors provides information of interest to sports nutritionists. Although there has been little analysis of the quality of research undertaken in this field, there is evidence of a number of flaws and methodological concerns in some of the studies in the available literature. This review undertook a systematic assessment of the attributes of research assessing the nutritional knowledge and attitudes of athletes and coaches. Sixty questionnaire-based studies were identified by a search of official databases using specific key terms with subsequent analysis by certain inclusion-exclusion criteria. These studies were then analyzed using 33 research quality criteria related to the methods, questionnaires, and statistics used. We found that many studies did not provide information on critical issues such as research hypotheses (92%), the gaining of ethics approval (50%) or informed consent (35%), or acknowledgment of limitations in the implementation of studies or interpretation of data (72%). Many of the samples were nonprobabilistic (85%) and rather small (42%). Many questionnaires were of unknown origin (30%), validity (72%), and reliability (70%) and resulted in low (≤ 60%) response rates (38%). Pilot testing was not undertaken in 67% of the studies. Few studies dealt with sample size (2%), power (3%), assumptions (7%), confidence intervals (3%), or effect sizes (3%). Improving some of these problems and deficits may enhance future research in this field.
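    Among the gaps this review counts is that only 3% of studies reported effect sizes. As an illustration of what such reporting involves, here is a hedged sketch of Cohen's d for two independent groups; the group names and questionnaire scores are entirely hypothetical.

```python
# Sketch: Cohen's d, the kind of effect size the review found missing.
# Scores below are made-up nutritional-knowledge questionnaire totals.
import math
import statistics

def cohens_d(a, b):
    """Standardized mean difference using the pooled sample SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(pooled_var)

athletes = [72, 68, 75, 70, 71, 69]   # hypothetical scores, group 1
coaches  = [65, 66, 70, 63, 67, 64]   # hypothetical scores, group 2
print(round(cohens_d(athletes, coaches), 2))  # → 2.01
```

    Reporting d alongside a p-value lets readers judge practical significance independently of sample size, which is precisely why the reviewed field's 3% rate is a concern.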

  4. Assessment of roadside surface water quality of Savar, Dhaka, Bangladesh using GIS and multivariate statistical techniques

    Science.gov (United States)

    Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir

    2017-11-01

    In this study, multivariate statistical techniques in combination with GIS are used to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season and 15 water quality parameters including TSS, TDS, pH, DO, BOD, Cl⁻, F⁻, NO₃⁻, NO₂⁻, SO₄²⁻, Ca, Mg, K, Zn and Pb were measured. The univariate overview of the water quality parameters is: TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl⁻ 51.494 ± 28.095 mg/l, F⁻ 0.771 ± 0.153 mg/l, NO₃⁻ 2.211 ± 0.878 mg/l, NO₂⁻ 4.692 ± 5.971 mg/l, SO₄²⁻ 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick-field dust settles (TDS, TSS) in the nearby water body. PC2 explains 22.123% of the total variance and indicates the agricultural influence (K, Ca, and NO₂⁻). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO₄²⁻, Cl⁻, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yielded three major clusters for both water parameters and sampling sites. The site clusters showed a grouping pattern similar to the R-mode factor score map. The present work opens a new scope for monitoring roadside water quality in future research in Bangladesh.

  5. Analyzing quality of colorectal cancer care through registry statistics: a small community hospital example.

    Science.gov (United States)

    Hopewood, Ian

    2011-01-01

    As the number of elderly Americans requiring oncologic care grows, and as cancer treatment and medicine become more advanced, assessing quality of cancer care becomes a necessary and advantageous practice for any facility. Such analysis is especially practical in small community hospitals, which may not have the resources of their larger academic counterparts to ensure that the care being provided is current and competitive in terms of both technique and outcome. This study is a comparison of the colorectal cancer care at one such center, Falmouth Community Hospital (FCH)--located in Falmouth, Massachusetts, about an hour and a half away from the nearest metropolitan center--to the care provided at a major nearby Boston Tertiary Center (BTC) and at teaching and research facilities across New England and the United States. The metrics used to measure performance encompass both outcome (survival rate data) as well as technique, including quality of surgery (number of lymph nodes removed) and the administration of adjuvant treatments, chemotherapy, and radiation therapy, as per national guidelines. All data for comparison between FCH and BTC were culled from those hospitals' tumor registries. Data for the comparison between FCH and national tertiary/referral centers were taken from the American College of Surgeons' Commission on Cancer, namely National Cancer Data Base (NCDB) statistics, Hospital Benchmark Reports and Practice Profile Reports. The results showed that, while patients at FCH were diagnosed at both a higher age and at a more advanced stage of colorectal cancer than their BTC counterparts, FCH stands up favorably to BTC and other large centers in terms of the metrics referenced above. Quality assessment such as the analysis conducted here can be used at other community facilities to spotlight, and ultimately eliminate, deficiencies in cancer programs.

  6. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.
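    The PWLS objective underlying this kind of reconstruction can be illustrated on a toy scale. The sketch below is an assumption-laden stand-in, not the authors' method: a 1-D "image", an identity forward model in place of a CT system matrix, made-up statistical weights, and a first-difference roughness penalty, minimizing (y − Ax)ᵀW(y − Ax) + β‖Dx‖² in closed form.

```python
# Toy PWLS sketch (NOT a CT model): closed-form minimizer of
#   (y - A x)^T W (y - A x) + beta * ||D x||^2
import numpy as np

rng = np.random.default_rng(1)
n = 8
x_true = np.zeros(n)
x_true[3:5] = 1.0                          # a small high-contrast "blob"
A = np.eye(n)                              # identity forward model (toy stand-in)
w = rng.uniform(0.5, 2.0, size=n)          # hypothetical statistical weights
y = A @ x_true + rng.normal(scale=0.2, size=n) / np.sqrt(w)

D = np.diff(np.eye(n), axis=0)             # first-difference roughness operator
beta = 0.5                                 # regularization strength
W = np.diag(w)

# Normal equations: (A^T W A + beta D^T D) x = A^T W y
x_hat = np.linalg.solve(A.T @ W @ A + beta * D.T @ D, A.T @ W @ y)
print(np.round(x_hat, 2))
```

    The interaction the abstract describes is visible even here: where the weights w are large, the data term dominates the fixed penalty, so resolution and noise vary across the image unless β (or the certainty) is made spatially varying.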

  7. Investigating output and energy variations and their relationship to delivery QA results using Statistical Process Control for helical tomotherapy.

    Science.gov (United States)

    Binny, Diana; Mezzenga, Emilio; Lancaster, Craig M; Trapp, Jamie V; Kairn, Tanya; Crowe, Scott B

    2017-06-01

    The aims of this study were to investigate machine beam parameters using the TomoTherapy quality assurance (TQA) tool, establish a correlation to patient delivery quality assurance results and to evaluate the relationship between energy variations detected using different TQA modules. TQA daily measurement results from two treatment machines for periods of up to 4 years were acquired. Analyses of beam quality and of helical and static output variations were made. Variations from planned dose were also analysed using the Statistical Process Control (SPC) technique and their relationship to output trends was studied. Energy variations appeared to be one of the contributing factors to the delivery output dose variations seen in the analysis. Ion chamber measurements were reliable indicators of energy and output variations and were linear with patient dose verifications. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
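    The core SPC calculation used in studies like this is the Shewhart individuals chart: estimate process spread from the average moving range and flag points outside mean ± 3σ̂. The daily output readings below are made-up numbers, not the TQA data.

```python
# Minimal SPC sketch (hypothetical daily output readings, not the TQA data):
# an individuals (X) chart flags points outside mean +/- 3 * sigma_hat, where
# sigma_hat = mean moving range / 1.128 (the d2 constant for subgroup size 2).
def spc_limits(readings):
    n = len(readings)
    mean = sum(readings) / n
    moving_ranges = [abs(a - b) for a, b in zip(readings, readings[1:])]
    sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma_hat, mean + 3 * sigma_hat

# Daily machine output in % of baseline (made-up), last point deliberately off:
outputs = [100.1, 99.8, 100.3, 99.9, 100.0, 100.2, 99.7, 104.0]
lcl, ucl = spc_limits(outputs[:-1])        # limits from the stable period
out_of_control = [x for x in outputs if not lcl <= x <= ucl]
print(out_of_control)  # → [104.0]
```

    In practice the limits are set from an in-control baseline period, then each new daily measurement is tested against them, which is how output and energy drifts become visible as trends or rule violations.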

  8. Requirements for quality control of analytical data

    International Nuclear Information System (INIS)

    Westmoreland, R.D.; Bartling, M.H.

    1990-07-01

    The National Contingency Plan (NCP) of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) provides procedures for the identification, evaluation, and remediation of past hazardous waste disposal sites. The Hazardous Materials Response section of the NCP consists of several phases: Preliminary Assessment, Site Inspection, Remedial Investigation, Feasibility Study, Remedial Design, and Remedial Action. During any of these phases, analysis of soil, water, and waste samples may be performed. The Hazardous Waste Remedial Actions Program (HAZWRAP) is involved in performing field investigations and sample analyses pursuant to the NCP for the US Department of Energy and other federal agencies. The purpose of this document is to specify the requirements of Martin Marietta Energy Systems, Inc., for the control of accuracy, precision, and completeness of samples and data from the point of collection through analysis. Requirements include data reduction and reporting of resulting environmentally related data. Because every instance and concern may not be addressed in this document, HAZWRAP subcontractors are encouraged to discuss any questions with the Analytical Quality Control Specialist (AQCS) and the HAZWRAP Project Manager. This revision supersedes all other versions of this document.

  9. Chapter 5: Quality assurance/quality control in stormwater sampling

    Science.gov (United States)

    Sampling the quality of stormwater presents unique challenges because stormwater flow is relatively short-lived with drastic variability. Furthermore, storm events often occur with little advance warning, outside conventional work hours, and under adverse weather conditions. Therefore, most stormwat...

  10. 21 CFR 111.117 - What quality control operations are required for equipment, instruments, and controls?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for equipment, instruments, and controls? 111.117 Section 111.117 Food and Drugs FOOD AND DRUG ADMINISTRATION... and Process Control System: Requirements for Quality Control § 111.117 What quality control operations...

  11. An Austrian framework for PET quality control

    International Nuclear Information System (INIS)

    Nicoletti, R.; Dobrozemsky, G.; Minear, G.; Bergmann, H.

    2002-01-01

    Full text: The European patient protection directive (97/43 EURATOM) requires regular routine quality control (QC) of PET imaging devices. Since no standards were available covering this area and in order to comply with the directive a joint working party of the Austrian societies of nuclear medicine and of medical physics have developed a set of procedures suitable for both dedicated PET scanners and gamma cameras operating in coincidence mode (GCPET). The routine procedures proposed include both manufacturer recommended procedures and tests for specific parameters and calibration procedures. Wherever possible, procedures adapted or derived from NEMA standards publication NU 2-2001 were used to permit direct comparison with specified parameters of image quality. For dedicated PET scanners the most important procedures are the checking of detector sensitivities and the attenuation calibration scan. With full ring scanners the attenuation calibration scan is a blank scan, with partial ring devices a special attenuation calibration phantom has to be used. Test protocols are specific to manufacturer and scanner type. They are usually performed automatically overnight. In addition, some instruments require special calibrations, e.g. gain adjustments or coincidence timing calibration. GCPET procedures include the frequent assessment in coincidence mode of detector uniformity, energy resolution and system sensitivity. Common to both dedicated PET and GCPET are the regular quarterly assessment of tomographic spatial resolution and the calibration of the system for quantitative measurements. As a total performance test for both systems assessment of image quality following NU 2-2001 was included, to be carried out after major system changes or repairs. The suite of QC procedures was tested on several dedicated PET and GCPET systems including all major manufacturers' systems. Due to missing hardware or software not all tests could be performed on all systems. Some of the

  12. Quality control and characterization of bentonite materials

    International Nuclear Information System (INIS)

    Kiviranta, L.; Kumpulainen, S.

    2011-12-01

    . Thus, to a certain extent, index tests can be used to determine the smectite content indicatively for quality control purposes. The previously set acceptance testing requirement limits for swelling index, liquid limit and CEC should be reconsidered: the Ca-bentonite tested in this study did not fulfill the requirement for swelling index, the previously set liquid limit requirement was well below the values measured in this study, and the previously set CEC requirement limits were based on a technique that needed different requirement limits for Na- and Ca-bentonites, contrary to the method used in this study. (orig.)

  13. Reducing lumber thickness variation using real-time statistical process control

    Science.gov (United States)

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  14. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Science.gov (United States)

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  15. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    Science.gov (United States)

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  16. Austrian Daily Climate Data Rescue and Quality Control

    Science.gov (United States)

    Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.

    2010-09-01

    Checked climate datasets are a "conditio sine qua non" for all projects that are relevant for environment and climate. In the framework of climate change studies and analyses it is essential to work with quality-controlled and trustworthy data. Furthermore, these datasets are used as input for various simulation models. For investigations of extreme events, like strong precipitation periods, drought periods and similar ones, we need climate data in high temporal resolution (at least daily resolution). Because of the historical background - during the Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack and important information was lost - only some climate sheets, mostly duplicates, from before 1939 are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna started a first attempt to digitize climate data by means of punch cards. With the introduction of a routine climate data quality control in 1984 we can speak of high-quality checked daily data (finally checked data, quality flag 6). Our group has been working on the digitization and quality control of the historical data for the period 1872 to 1983 for 18 years. Since 2007 it has been possible to intensify this work in the framework of an internal project, namely Austrian Climate Data Rescue and Quality Control. The aim of this initiative was - and still is - to supply daily data in an outstandingly good and uniform quality. So this project is a kind of pre-project for all scientific projects which work with daily data. In addition to the routine quality checks (running since 1984) using the commercial Bull software, we are testing our data with additional open source software, namely ProClim.db. By the use of this spatial and statistical test procedure, the elements air temperature and precipitation - for several sites in Carinthia - could

  17. 7 CFR 58.733 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.733 Section 58.733... Procedures § 58.733 Quality control tests. (a) Chemical analyses. The following chemical analyses shall be... pasteurization by means of the phosphatase test, as well as any other tests necessary to assure good quality...

  18. Minimal requirements for quality controls in radiotherapy with external beams

    International Nuclear Information System (INIS)

    1999-01-01

    Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not determine strict rules or firm recommendations, it suggests minimal requirements for quality controls necessary to guarantee an adequate degree of accuracy in external beam radiotherapy [it]

  19. Performance and quality control of nuclear medicine instrumentation

    International Nuclear Information System (INIS)

    Paras, P.

    1981-01-01

    The status and the recent developments of nuclear medicine instrumentation performance, with an emphasis on gamma-camera performance, are discussed as the basis for quality control. New phantoms and techniques for the measurement of gamma-camera performance parameters are introduced and their usefulness for quality control is discussed. Tests and procedures for dose calibrator quality control are included. Also, the principles of quality control, tests, equipment and procedures for each type of instrument are reviewed, and minimum requirements for an effective quality assurance programme for nuclear medicine instrumentation are suggested. (author)

  20. Adaptive statistical iterative reconstruction improves image quality without affecting perfusion CT quantitation in primary colorectal cancer

    Directory of Open Access Journals (Sweden)

    D. Prezzi

    Objectives: To determine the effect of Adaptive Statistical Iterative Reconstruction (ASIR) on perfusion CT (pCT) parameter quantitation and image quality in primary colorectal cancer. Methods: Prospective observational study. Following institutional review board approval and informed consent, 32 patients with colorectal adenocarcinoma underwent pCT (100 kV, 150 mA, 120 s acquisition, axial mode). Tumour regional blood flow (BF), blood volume (BV), mean transit time (MTT) and permeability surface area product (PS) were determined using identical regions-of-interest for ASIR percentages of 0%, 20%, 40%, 60%, 80% and 100%. Image noise, contrast-to-noise ratio (CNR) and pCT parameters were assessed across ASIR percentages. Coefficients of variation (CV), repeated measures analysis of variance (rANOVA) and Spearman's rank order correlation were performed with statistical significance at 5%. Results: With increasing ASIR percentages, image noise decreased by 33% while CNR increased by 61%; peak tumour CNR was greater than 1.5 with 60% ASIR and above. Mean BF, BV, MTT and PS differed by less than 1.8%, 2.9%, 2.5% and 2.6% across ASIR percentages. CV were 4.9%, 4.2%, 3.3% and 7.9%; rANOVA P values: 0.85, 0.62, 0.02 and 0.81 respectively. Conclusions: ASIR improves image noise and CNR without altering pCT parameters substantially. Keywords: Perfusion imaging, Multidetector computed tomography, Colorectal neoplasms, Computer-assisted image processing, Radiation dosage
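    The coefficients of variation quoted in this abstract summarize how stable each perfusion parameter is across the ASIR settings: the sample standard deviation of the repeated estimates divided by their mean, as a percentage. The blood-flow readings below are hypothetical stand-ins, not the study data.

```python
# Sketch: coefficient of variation (CV%) across reconstruction settings.
# Readings are made-up blood-flow estimates, NOT the study's measurements.
import statistics

def cv_percent(values):
    """Sample SD / mean, expressed as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# One tumour ROI's blood-flow estimate at ASIR 0/20/40/60/80/100% (hypothetical):
bf = [61.0, 60.2, 59.5, 60.8, 59.9, 60.6]
print(round(cv_percent(bf), 2))
```

    A CV of a few percent, as reported for BF, BV and MTT, indicates that changing the ASIR percentage barely perturbs the quantitative parameter.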