WorldWideScience

Sample records for statistical quality control

  1. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and fields related to Reliability. Experts with international reputations present their newest contributions.

  2. Frontiers in statistical quality control 11

    CERN Document Server

    Schmid, Wolfgang

    2015-01-01

    The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientifi...

  3. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2001-01-01

    The book is a collection of papers presented at the 5th International Workshop on Intelligent Statistical Quality Control in Würzburg, Germany. Contributions deal with methodology and successful industrial applications. They can be grouped in four categories: Sampling Inspection, Statistical Process Control, Data Analysis and Process Capability Studies, and Experimental Design.

  4. Control cards as a statistical quality control resource

    Directory of Open Access Journals (Sweden)

    Aleksandar Živan Drenovac

    2013-02-01

    This paper shows that the application of statistical methods can contribute significantly to improving the quality of products and services, as well as to raising an institution's standing. The determination of optimal, anticipatory and limiting values is based on the statistical analysis of samples. Control cards are a reliable instrument, simple to use and efficient for process control, by which a process is kept within set limits. Thus, control cards can be applied to quality control of the production of weapons and military equipment and the maintenance of technical systems, as well as to setting standards and raising the quality level of many other activities.

  5. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.

    1991-01-01

    In scientific research laboratories it is rarely possible to use quality assurance schemes developed for large-scale analysis. Instead, methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control: analysis of precision serves to detect analytical errors by comparing the a priori precision of the analytical results with the actual variability observed among replicates or duplicates. The method relies on the chi-square distribution to detect excess variability and is quite sensitive even for 5-10 results. Interference control serves to detect analytical bias by comparing results obtained by two different analytical methods, each relying on a different detection principle and therefore exhibiting different influence from matrix elements; only 5-10 sets of results are required to establish whether a regression line passes...
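
    As an illustration of the analysis-of-precision check described above, here is a minimal Python sketch; the duplicate results, the a priori standard deviation and the 0.95 chi-square limit are invented for the example and are not taken from the paper.

        # Analysis of precision: compare the observed scatter of duplicate
        # determinations with the a priori analytical uncertainty via chi-square.
        # Illustrative sketch only; data and the 0.95 limit are invented.
        import numpy as np
        from scipy.stats import chi2

        duplicates = np.array([      # pairs of duplicate results
            [10.2, 10.5],
            [ 9.8, 10.1],
            [10.4,  9.9],
            [10.0, 10.6],
            [ 9.7, 10.0],
        ])
        sigma_a_priori = 0.25        # claimed standard deviation of one determination

        # each pair contributes (x1 - x2)^2 / (2 sigma^2), chi-square with 1 dof
        t = np.sum((duplicates[:, 0] - duplicates[:, 1]) ** 2) / (2 * sigma_a_priori ** 2)
        dof = len(duplicates)
        limit = chi2.ppf(0.95, dof)

        print(f"T = {t:.2f}, chi2(0.95, {dof}) = {limit:.2f}")
        print("excess variability" if t > limit else "consistent with a priori precision")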

  6. Net analyte signal based statistical quality control

    NARCIS (Netherlands)

    Skibsted, E.T.S.; Boelens, H.F.M.; Westerhuis, J.A.; Smilde, A.K.; Broad, N.W.; Rees, D.R.; Witte, D.T.

    2005-01-01

    Net analyte signal statistical quality control (NAS-SQC) is a new methodology to perform multivariate product quality monitoring based on the net analyte signal approach. The main advantage of NAS-SQC is that the systematic variation in the product due to the analyte (or property) of interest is

  7. Statistical quality control a loss minimization approach

    CERN Document Server

    Trietsch, Dan

    1999-01-01

    While many books on quality espouse the Taguchi loss function, they do not examine its impact on statistical quality control (SQC). But using the Taguchi loss function sheds new light on questions relating to SQC and calls for some changes. This book covers SQC in a way that conforms with the need to minimize loss. Subjects often not covered elsewhere include: (i) measurements, (ii) determining how many points to sample to obtain reliable control charts (for which purpose a new graphic tool, diffidence charts, is introduced), (iii) the connection between process capability and tolerances, (iv)

  8. A computerized diagnostic system for nuclear plant control rooms based on statistical quality control

    International Nuclear Information System (INIS)

    Heising, C.D.; Grenzebach, W.S.

    1990-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant located in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, which were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators

  9. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    Science.gov (United States)

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January, 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
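
    The article describes a step-by-step SPC study at a soap manufacturer; its data are not reproduced in this record, so the sketch below only illustrates the generic X-bar/R chart calculation on simulated subgroups of size five, using the standard Shewhart chart factors.

        # Generic X-bar / R control limits for subgroups of size 5 (simulated data).
        # A2, D3, D4 are the standard Shewhart chart factors for n = 5.
        import numpy as np

        rng = np.random.default_rng(1)
        subgroups = rng.normal(loc=100.0, scale=2.0, size=(25, 5))   # 25 subgroups

        xbar = subgroups.mean(axis=1)                      # subgroup means
        ranges = subgroups.max(axis=1) - subgroups.min(axis=1)

        xbar_bar, r_bar = xbar.mean(), ranges.mean()
        A2, D3, D4 = 0.577, 0.0, 2.114

        ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
        ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

        print(f"X-bar chart: CL={xbar_bar:.2f} LCL={lcl_x:.2f} UCL={ucl_x:.2f}")
        print(f"R chart:     CL={r_bar:.2f} LCL={lcl_r:.2f} UCL={ucl_r:.2f}")
        print("out-of-control subgroups:",
              np.where((xbar > ucl_x) | (xbar < lcl_x))[0].tolist())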

  11. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  12. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  13. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  14. The application of statistical process control in linac quality assurance

    International Nuclear Information System (INIS)

    Li Dingyu; Dai Jianrong

    2009-01-01

    Objective: To improve a linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC is applied to set the control limits of QA data, draw charts and differentiate between random and systematic errors. An SPC quality assurance software package named QA MANAGER has been developed in VB for clinical use. Two clinical cases are analyzed with SPC to study the daily output QA of a 6 MV photon beam. Results: In the clinical cases, SPC is able to identify the systematic errors. Conclusion: SPC may assist in detecting systematic errors in linac quality assurance, alarming on abnormal trends so that systematic errors can be eliminated and quality control improved. (authors)

  15. Application of Statistical Process Control (SPC) in Quality Control

    Directory of Open Access Journals (Sweden)

    Carlos Hernández-Pedrera

    2015-12-01

    The overall objective of this paper is to use SPC to assess the possibility of improving the process of manufacturing a sanitary device. The specific objectives were to identify the variables to be analyzed under statistical process control (SPC), to analyze the possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with SPC before and after direct monitoring of the production line. Sampling and laboratory methods were used to determine the quality of the finished product, and statistical methods were then applied to highlight the importance and contribution of SPC in monitoring corrective actions and supporting production processes. It was shown that the process is under control, because the results fall within the established control limits. There is, however, a tendency for the distribution to drift toward one of the limits, creating the possibility that under certain conditions the process goes out of control; the results also showed that the process, although within the quality control limits, is operating far from optimal conditions. No products outside the weight and discoloration limits were obtained in any of the study situations, but defective products were obtained.

  16. Statistical quality management using miniTAB 14

    International Nuclear Information System (INIS)

    An, Seong Jin

    2007-01-01

    This book explains statistical quality management, covering the definition of quality, quality management, quality cost, basic methods of quality management, principles of control charts, control charts for variables, control charts for attributes, capability analysis, other issues of statistical process control, acceptance sampling, sampling for variables acceptance, design and analysis of experiments, Taguchi quality engineering, response surface methodology, and reliability analysis.

  17. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product

  18. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  19. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    Science.gov (United States)

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
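
    A small sketch of the sigma-metric calculation mentioned above, using the commonly cited form sigma = (allowable total error - |bias|) / CV with all terms in percent; the function name and the numbers are invented for illustration.

        # Sigma-metric at a medical decision concentration, all terms in percent.
        # The numeric values are invented for illustration.
        def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
            return (tea_pct - abs(bias_pct)) / cv_pct

        tea, bias, cv = 10.0, 1.5, 2.0      # allowable total error, bias, imprecision
        print(f"sigma-metric = {sigma_metric(tea, bias, cv):.2f}")
        # a high sigma permits simple SQC with few control measurements,
        # a low sigma calls for multirule QC and more controls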

  20. Assessment of the GPC Control Quality Using Non–Gaussian Statistical Measures

    Directory of Open Access Journals (Sweden)

    Domański Paweł D.

    2017-06-01

    This paper presents an alternative approach to the task of control performance assessment. Various statistical measures based on Gaussian and non-Gaussian distribution functions are evaluated. The analysis starts with a review of control error histograms, followed by their statistical analysis using probability distribution functions. Simulation results obtained for a control system with the generalized predictive control algorithm are considered. The proposed approach using Cauchy and Lévy α-stable distributions shows robustness against disturbances and enables effective control loop quality evaluation. Tests of the predictive algorithm prove its ability to detect the impact of the main controller parameters, such as the model gain, the dynamics or the prediction horizon.
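
    To make the idea of comparing Gaussian and heavy-tailed fits to a control-error histogram concrete, the sketch below fits normal and Cauchy distributions to a simulated error sample with SciPy and compares them by log-likelihood; the data and the comparison criterion are illustrative and not the paper's procedure.

        # Fit Gaussian and Cauchy distributions to a simulated heavy-tailed
        # control-error sample and compare the fits by log-likelihood.
        import numpy as np
        from scipy.stats import norm, cauchy

        rng = np.random.default_rng(0)
        error = np.concatenate([rng.normal(0.0, 1.0, 900),    # bulk of the errors
                                rng.normal(0.0, 6.0, 100)])   # occasional large ones

        mu, sd = norm.fit(error)
        loc, scale = cauchy.fit(error)

        ll_norm = norm.logpdf(error, mu, sd).sum()
        ll_cauchy = cauchy.logpdf(error, loc, scale).sum()

        print(f"Gaussian: mu={mu:.3f} sigma={sd:.3f} logL={ll_norm:.1f}")
        print(f"Cauchy:   loc={loc:.3f} scale={scale:.3f} logL={ll_cauchy:.1f}")
        # a clearly higher Cauchy log-likelihood means that a mean/variance summary
        # understates the tails of the loop error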

  1. Quality Control of the Print with the Application of Statistical Methods

    Science.gov (United States)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

    The basis for standardizing the process of offset printing is the control of print quality indicators. The solution of this problem has various approaches, among which the most important are statistical methods. Practical implementation of them for managing the quality of the printing process is very relevant and is reflected in this paper. The possibility of using the method of constructing a Control Card to identify the reasons for the deviation of the optical density for a triad of inks in offset printing is shown.

  2. Use of statistical process control as part of a quality assurance plan

    International Nuclear Information System (INIS)

    Acosta, S.; Lewis, C.

    2013-01-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of results through the control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory of environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work concerns the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This allowed significant variables in the process, such as temperature and humidity, to be considered and a blank control chart to be built, which forms the basis of statistical process control. The data obtained yielded the lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan.

  3. Application of Statistics in Industrial Quality Improvement

    International Nuclear Information System (INIS)

    Akhmad-Fauzy

    2000-01-01

    Application of statistical methods in the industrial field is relatively new compared with agriculture and biology. Statistical methods applied in the industrial field focus more on industrial system control and are useful for maintaining economical control of product quality in large-scale production. Application of statistical methods in the industrial field has increased rapidly. This fact is supported by the release of the ISO 9000 quality system in 1987 as an international quality standard, which has been adopted by more than 100 countries. (author)

  4. Statistical process control for radiotherapy quality assurance

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.

    2005-01-01

    Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a very few large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques are applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process, whereas with a standard deviation calculation no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process
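
    A minimal sketch of the contrast reported above: individuals ("process behavior") chart limits computed from the average moving range of a baseline period versus naive limits from the overall standard deviation, applied to simulated daily-output data with a 2% shift; the data and the baseline length are invented.

        # Individuals chart limits from the average moving range of a baseline
        # period versus naive limits from the overall standard deviation.
        import numpy as np

        rng = np.random.default_rng(7)
        output = np.concatenate([rng.normal(100.0, 0.3, 30),    # baseline days
                                 rng.normal(102.0, 0.3, 22)])   # 2% shift

        baseline = output[:30]
        center = baseline.mean()
        sigma_hat = np.abs(np.diff(baseline)).mean() / 1.128    # d2 for n = 2
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        sd_all = output.std(ddof=1)          # overall SD is inflated by the shift
        print(f"behavior-chart limits: {lcl:.2f} .. {ucl:.2f}")
        print(f"naive mean +/- 3 SD:   {output.mean() - 3*sd_all:.2f} .. {output.mean() + 3*sd_all:.2f}")
        print("days outside behavior-chart limits:",
              np.where((output > ucl) | (output < lcl))[0].tolist())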

  5. Statistical methods for quality improvement

    National Research Council Canada - National Science Library

    Ryan, Thomas P

    2011-01-01

    ...."-TechnometricsThis new edition continues to provide the most current, proven statistical methods for quality control and quality improvementThe use of quantitative methods offers numerous benefits...

  6. Methods and applications of statistics in engineering, quality control, and the physical sciences

    CERN Document Server

    Balakrishnan, N

    2011-01-01

    Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...

  7. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    An, Sang Ha

    2006-02-01

    Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of a heat exchanger. This research uses Shewhart X-bar and R charts, Cumulative Sum charts (CUSUM), and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control, and a Control Chart Analyzer (CCA) was developed to support these analyses and decide when the process is in error. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability
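
    The tabular CUSUM used for this kind of early-warning monitoring can be sketched in a few lines; the monitored variable, the in-control parameters and the textbook choices k = 0.5 sigma and h = 5 sigma are assumptions for illustration, not values from the thesis.

        # One-sided tabular CUSUM for a small upward shift in a monitored variable.
        import numpy as np

        rng = np.random.default_rng(3)
        x = np.concatenate([rng.normal(50.0, 1.0, 40),     # in control
                            rng.normal(51.0, 1.0, 30)])    # sustained 1-sigma shift

        mu0, sigma = 50.0, 1.0
        k, h = 0.5 * sigma, 5.0 * sigma     # reference value and decision interval

        c_plus, alarm_at = 0.0, None
        for i, xi in enumerate(x):
            c_plus = max(0.0, c_plus + (xi - mu0 - k))
            if c_plus > h:
                alarm_at = i
                break

        print("CUSUM alarm at observation:", alarm_at)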

  8. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of the automatic processor and promote QC by analyzing the QC management chart for the automatic processor with statistical methods, evaluating and interpreting the data and the trend of the chart. Method: The speed, contrast and minimum density of the step wedge of a film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated. The data and the working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, the authors can judge whether it is a symmetric bell-shaped curve or not. If not, it indicates that a few extreme values overstepping the control limits may be pulling the curve to the left or right. If it is a normal distribution, the standard deviation (s) is examined. When x-bar +- 2s lies within the upper and lower control limits of the relative performance indexes, it indicates that the processor worked in a stable state during this period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. The authors can deepen the understanding and application of the trend chart and raise quality management to a new level

  9. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    Science.gov (United States)

    Gourrion, J.; Szekely, T.

    2016-02-01

    Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit to the next version of the Copernicus CMEMS dataset.
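
    A toy version of the min/max validity-interval idea described above, for a single grid cell and depth bin; the reference sample, the margin added beyond the historical extremes and the test values are all invented.

        # Validity interval from historical extreme values for one grid cell and
        # one depth bin, instead of mean +/- k * SD.
        import numpy as np

        rng = np.random.default_rng(5)
        reference = rng.normal(12.0, 1.5, 5000)        # historical temperatures, degC

        t_min, t_max = reference.min(), reference.max()
        margin = 0.1 * (t_max - t_min)                 # tolerance beyond the extremes
        lo, hi = t_min - margin, t_max + margin

        new_obs = np.array([11.2, 13.5, 19.8, 12.1, 4.0])
        for value in new_obs:
            status = "suspect" if (value < lo or value > hi) else "good"
            print(f"{value:5.1f} degC -> {status}  (interval [{lo:.2f}, {hi:.2f}])")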

  10. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Science.gov (United States)

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three dimensional (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI
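
    The normalized cross-correlation metric tracked in this study can be written in a few lines; the sketch below uses small simulated 2-D arrays rather than CBCT sub-volumes, and the noise level and shift are arbitrary.

        # Normalized cross-correlation between a reference patch and a test image.
        import numpy as np

        def ncc(a: np.ndarray, b: np.ndarray) -> float:
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float((a * b).mean())

        rng = np.random.default_rng(11)
        reference = rng.normal(size=(64, 64))
        registered = reference + rng.normal(scale=0.1, size=(64, 64))
        shifted = np.roll(reference, 5, axis=0) + rng.normal(scale=0.1, size=(64, 64))

        print(f"NCC, well registered: {ncc(reference, registered):.3f}")
        print(f"NCC, 5-pixel shift:   {ncc(reference, shifted):.3f}")
        # per-fraction NCC values can then be tracked on a control chart with
        # patient-specific limits set from the first few fractions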

  11. Improved statistical method for temperature and salinity quality control

    Science.gov (United States)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.

  12. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in

  13. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Science.gov (United States)

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  14. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    For any quality evaluation in sports, science, education and so on, it is useful to collect data in order to construct a strategy for improving the quality of the services offered to the user. For this purpose, we use statistical software packages to process the data collected, with the aim of increasing customer satisfaction. The principle is demonstrated with the example of student satisfaction ratings at Belgrade Polytechnic, where the students are the users and Belgrade Polytechnic is the institution whose quality is assessed. Here the emphasis is on statistical analysis as a tool for quality control aimed at improvement, not on the interpretation of the results. Therefore, the above can be used as a model in sport to improve overall results.

  15. Assessing thermal comfort and energy efficiency in buildings by statistical quality control for autocorrelated data

    International Nuclear Information System (INIS)

    Barbeito, Inés; Zaragoza, Sonia; Tarrío-Saavedra, Javier; Naya, Salvador

    2017-01-01

    Highlights: • Intelligent web platform development for energy efficiency management in buildings. • Controlling and supervising thermal comfort and energy consumption in buildings. • Statistical quality control procedure to deal with autocorrelated data. • Open source alternative using R software. - Abstract: In this paper, a case study of performing a reliable statistical procedure to evaluate the quality of HVAC systems in buildings using data retrieved from an ad hoc big data web energy platform is presented. The proposed methodology based on statistical quality control (SQC) is used to analyze the real state of thermal comfort and energy efficiency of the offices of the company FRIDAMA (Spain) in a reliable way. Non-conformities or alarms, and the actual assignable causes of these out of control states, are detected. The capability to meet specification requirements is also analyzed. Tools and packages implemented in the open-source R software are employed to apply the different procedures. First, this study proposes to fit ARIMA time series models to CTQ variables. Then, the application of Shewhart and EWMA control charts to the time series residuals is proposed to control and monitor thermal comfort and energy consumption in buildings. Once thermal comfort and consumption variability are estimated, the implementation of capability indexes for autocorrelated variables is proposed to calculate the degree to which standard specifications are met. According to the case study results, the proposed methodology has detected real anomalies in the HVAC installation, helping to detect assignable causes and to make appropriate decisions. One of the goals is to perform and describe this statistical procedure step by step so that practitioners can replicate it.
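
    A compressed sketch of the chart-on-residuals idea: fit a low-order ARIMA model to an autocorrelated series and put an EWMA chart on the residuals. It assumes statsmodels is available; the simulated series, model order, smoothing constant and 3-sigma width are illustrative, not the paper's settings.

        # Fit an ARIMA model to an autocorrelated series and apply an EWMA control
        # chart to the residuals (assumes statsmodels is installed).
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(2)
        n, phi = 300, 0.8                     # AR(1) "indoor temperature" series
        temp = np.empty(n)
        temp[0] = 22.0
        for t in range(1, n):
            temp[t] = 22.0 * (1 - phi) + phi * temp[t - 1] + rng.normal(0.0, 0.3)

        resid = ARIMA(temp, order=(1, 0, 0)).fit().resid

        lam = 0.2                             # EWMA smoothing constant
        sigma = resid.std(ddof=1)
        width = 3 * sigma * np.sqrt(lam / (2 - lam))
        z, out = resid.mean(), []
        for i, r in enumerate(resid):
            z = lam * r + (1 - lam) * z
            if abs(z - resid.mean()) > width:
                out.append(i)
        print("EWMA points beyond the control limits:", out)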

  16. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on the quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries, and the quarterly prevalence has not exceeded the control limits, that is, been out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
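
    Quarterly prevalence data of this kind are typically monitored with a p-chart; the sketch below computes p-chart limits for invented quarterly counts, not the study's data.

        # p-chart limits for a quarterly prevalence (invented counts).
        import numpy as np

        deliveries = np.array([620, 605, 640, 598, 615, 630, 610, 625])  # per quarter
        events = np.array([8, 7, 9, 5, 6, 4, 3, 4])                      # severe PPH

        p = events / deliveries
        p_bar = events.sum() / deliveries.sum()
        sigma = np.sqrt(p_bar * (1 - p_bar) / deliveries)
        ucl = p_bar + 3 * sigma
        lcl = np.clip(p_bar - 3 * sigma, 0, None)

        for q, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
            flag = "out of control" if (pi > hi or pi < lo) else "in control"
            print(f"Q{q}: p={pi:.4f} limits=[{lo:.4f}, {hi:.4f}] {flag}")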

  17. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    1997-01-01

    Like the preceding volumes, which met with a lively response, the present volume collects contributions stressing methodology or successful industrial applications. The papers are classified under four main headings: sampling inspection, process quality control, data analysis and process capability studies, and finally experimental design.

  18. Quality control statistic for laboratory analysis and assays in Departamento de Tecnologia de Combustiveis - IPEN-BR

    International Nuclear Information System (INIS)

    Lima, Waldir C. de; Lainetti, Paulo E.O.; Lima, Roberto M. de; Peres, Henrique G.

    1996-01-01

    The purpose of this work is to study the introduction of statistical control for the tests and analyses performed in the Departamento de Tecnologia de Combustiveis. The following are briefly introduced: the theory of statistical process control, the construction of control charts, the definition of standard tests (or analyses) and how the standards are employed to determine the control limits of the charts. The most significant result is the form applied for practical quality control; moreover, the use of a verification standard in the analysis control laboratory is also exemplified. (author)

  19. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The study used a process control and quality improvement design. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that

  20. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), and MR (moving…

  1. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    This study deals with the analysis of data with the aim of improving the quality of statistical tools in the assembly processes of automobile seats. A normal distribution of the variables is one of the prerequisites for the analysis, examination and improvement of manufacturing processes (e.g. manufacturing process capability), although there are increasingly more approaches to handling non-normal data. The appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution to the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study focuses on the measured data of an airbag assembly and aims to obtain normally distributed data and apply statistical process control to them. The results lead to rejection of the null hypothesis (the measured variables do not follow a normal distribution); therefore it is necessary to work on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, so a procedure should be proposed that leads to a quality output of the whole statistical control of the manufacturing processes.
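
    A minimal sketch of the test-then-transform step discussed above: check normality with a Shapiro-Wilk test and, if it is rejected, apply a Box-Cox transformation before computing control limits. The simulated data and the 0.05 threshold are illustrative; the study itself used StatGraphics and Minitab.

        # Test a quality characteristic for normality; if rejected, try a Box-Cox
        # transformation before computing control limits (simulated skewed data).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        x = rng.lognormal(mean=0.0, sigma=0.5, size=200)   # strictly positive

        stat, p_value = stats.shapiro(x)
        print(f"Shapiro-Wilk on raw data: p = {p_value:.4f}")

        if p_value < 0.05:
            x_t, lam = stats.boxcox(x)                     # requires positive data
            _, p_t = stats.shapiro(x_t)
            print(f"Box-Cox lambda = {lam:.2f}, p after transform = {p_t:.4f}")
            # control limits would then be set on the transformed scale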

  2. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    Science.gov (United States)

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features are facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. From Quality to Information Quality in Official Statistics

    Directory of Open Access Journals (Sweden)

    Kenett Ron S.

    2016-12-01

    The term quality of statistical data, developed and used in official statistics and in international organizations such as the International Monetary Fund (IMF) and the Organisation for Economic Co-operation and Development (OECD), refers to the usefulness of summary statistics generated by producers of official statistics. Similarly, in the context of survey quality, official agencies such as Eurostat, the National Center for Science and Engineering Statistics (NCSES) and Statistics Canada have created dimensions for evaluating the quality of a survey and its ability to report ‘accurate survey data’.

  4. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    Science.gov (United States)

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe statistical process control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met the inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the centre line suffered inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  5. Memory-type control charts in statistical process control

    NARCIS (Netherlands)

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis, plotted against time on the horizontal axis. The graph is completed with control limits that mark the expected extent of common-cause variation. Once

  6. Statistical methods in quality assurance

    International Nuclear Information System (INIS)

    Eckhard, W.

    1980-01-01

    During the different phases of a production process - planning, development and design, manufacturing, assembling, etc. - most decisions rest on a basis of statistics: the collection, analysis and interpretation of data. Statistical methods can be thought of as a kit of tools for solving problems in the quality functions of the quality loop, with the aim of producing quality products and reducing quality costs. Various statistical methods are presented, and typical examples of their practical application are demonstrated. (RW)

  7. Optimage central organised image quality control including statistics and reporting

    International Nuclear Information System (INIS)

    Jahnen, A.; Schilz, C.; Shannoun, F.; Schreiner, A.; Hermen, J.; Moll, C.

    2008-01-01

    Quality control of medical imaging systems is performed using dedicated phantoms. As imaging systems become more and more digital, adequate image processing methods can help to save evaluation time and to obtain objective results. The software package OPTIMAGE developed here focuses on this with a central approach: on one hand, OPTIMAGE provides a framework which includes functions like database integration, DICOM data sources, a multilingual user interface and image processing functionality. On the other hand, the test methods are implemented using modules which are able to process the images automatically for the common imaging systems. The integration of statistics and reporting into this environment is paramount: this is the only way to provide these functions in an interactive, user-friendly way. These features enable the users to discover degradation in performance quickly and to document performed measurements easily. (authors)

  8. Using Statistical Process Control to Enhance Student Progression

    Science.gov (United States)

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  9. QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Yonatan Mengesha Awaj

    2013-03-01

    In order to survive in a competitive market, improving the quality and productivity of products and processes is a must for any company. This study applies statistical process control (SPC) tools on the production processing line and on the final product in order to reduce defects, by identifying where the most waste occurs and by giving suggestions for improvement. The approach used in this study comprises direct observation, thorough examination of the production process lines, brainstorming sessions and fishbone diagrams; information was collected from potential customers and the company's workers through interviews and questionnaires, and a Pareto chart/analysis and a control chart (p-chart) were constructed. It was found that the company has many problems; specifically, there is a high rejection rate, or waste, in the production processing line. The highest waste occurs in the melting process line, which causes loss due to trickle, and in the forming process line, which causes loss due to defective product rejection. The vital few problems were identified: blisters, double seam, stone, pressure failure and overweight. The principal aim of the study is to make the quality team aware of how to use SPC tools in problem analysis, especially to train the quality team on how to hold an effective brainstorming session and to exploit these data in cause-and-effect diagram construction, Pareto analysis and control chart construction. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company has many constraints to implementing all suggestions for improvement within a short period of time, it recognizes that the suggestions will provide significant productivity improvement in the long run.
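
    The Pareto analysis used to isolate the vital few defect categories can be sketched as below; the defect counts are invented, while the category names follow the abstract.

        # Pareto analysis of defect categories to expose the vital few.
        defects = {
            "blister": 410, "double seam": 330, "stone": 250,
            "pressure failure": 180, "overweight": 150,
            "crack": 60, "dirty mould": 40, "other": 30,
        }

        total = sum(defects.values())
        cumulative = 0
        print(f"{'defect':18s}{'count':>7s}{'cum %':>8s}")
        for name, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
            cumulative += count
            print(f"{name:18s}{count:7d}{100 * cumulative / total:7.1f}%")
        # the categories that accumulate roughly 80% of the defects are the
        # "vital few" to attack first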

  10. Use of statistical process control in the production of blood components

    DEFF Research Database (Denmark)

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced when a change was made from a relatively large, inexperienced, occasional component manufacturing staff to an experienced regular manufacturing staff with four technologists. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood products. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional...

  11. Using a statistical process control chart during the quality assessment of cancer registry data.

    Science.gov (United States)

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.

  12. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Science.gov (United States)

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to improved safety of intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, which is characterized by the pretreatment quality control results. It is therefore necessary to bring the portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether the ionisation chamber can be replaced by portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin centre, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect the guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation revealed drifts, both slow and weak and strong and fast, and showed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to

  13. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose The first purpose of this study was to illustrate the contribution of statistical process control to improved safety of intensity-modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, which is characterized by the pretreatment quality control results. It is therefore necessary to bring the portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether the ionisation chamber can be replaced by portal dosimetry in order to optimize the time devoted to pretreatment quality control. Patients and methods At the Alexis-Vautrin centre, pretreatment quality controls in I.M.R.T. for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect the guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results Control charts of the mean and standard deviation revealed drifts, both slow and weak and strong and fast, and showed a special cause that had been introduced (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the E.P.I.D. and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion The study allowed to

  14. Statistical process control for serially correlated data

    NARCIS (Netherlands)

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice

  15. SU-C-BRD-01: A Statistical Modeling Method for Quality Control of Intensity- Modulated Radiation Therapy Planning

    International Nuclear Information System (INIS)

    Gao, S; Meyer, R; Shi, L; D'Souza, W; Zhang, H

    2014-01-01

    Purpose: To apply a statistical modeling approach, threshold modeling (TM), to the quality control of intensity-modulated radiation therapy (IMRT) treatment plans. Methods: A quantitative measure, the weighted sum of violations of dose/dose-volume constraints, was first developed to represent the quality of each IMRT plan. The threshold modeling approach, an extension of extreme value theory in statistics and an effective way to model extreme values, was then applied to analyze the quality of the plans as summarized by this quantitative measure. Our approach modeled the plans generated by planners as a series of independent and identically distributed random variables and described their behavior when the plan quality was controlled below a certain threshold. We tested our approach retrospectively on five locally advanced head and neck cancer patients. Two statistics were used for the numerical analysis: the probability of quality improvement (PQI) of the plans and the expected amount of improvement in the quantitative measure (EQI). Results: After clinical planners generated 15 plans for each patient, we applied our approach to obtain the PQI and EQI as if the planners were to generate an additional 15 plans. For two of the patients, the PQI was significantly higher than for the other three (0.17 and 0.18 compared with 0.08, 0.01 and 0.01). The actual percentage of the additional 15 plans that outperformed the best of the initial 15 plans was 20% and 27%, compared with 11%, 0% and 0%. The EQI for the two patients with potential for improvement was 34.5 and 32.9, and for the remaining three patients 9.9, 1.4 and 6.6. The actual improvements obtained were 28.3 and 20.5, compared with 6.2, 0 and 0. Conclusion: TM is capable of reliably identifying the potential quality improvement of IMRT plans. It provides clinicians an effective tool to assess the trade-off between extra planning effort and achievable plan quality. This work was supported in part by NIH/NCI grant CA130814
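
    The record describes threshold modeling built on extreme value theory. The sketch below is not the authors' formulation; it is a generic peaks-over-threshold illustration in which a generalized Pareto distribution is fitted to hypothetical plan-quality scores (lower is better) and used to estimate the chance that a further plan beats the best plan so far. The data, threshold choice and variable names are all assumptions.

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical plan-quality scores for one patient (lower = better).
rng = np.random.default_rng(0)
scores = rng.gamma(shape=4.0, scale=10.0, size=15)   # 15 plans

# Peaks-over-threshold: model how far plans fall below a chosen threshold.
threshold = np.percentile(scores, 40)                # illustrative threshold choice
excesses = threshold - scores[scores < threshold]    # improvements beyond threshold
shape, loc, scale = genpareto.fit(excesses, floc=0.0)

best = scores.min()
# Probability that a new sub-threshold plan improves on the current best plan.
p_beat_best = genpareto.sf(threshold - best, shape, loc=0.0, scale=scale)
print(f"P(next sub-threshold plan beats best) ~ {p_beat_best:.3f}")
```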

  16. An introduction to statistical process control in research proteomics.

    Science.gov (United States)

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. The aim is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means of tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier

  17. Radiographic rejection index using statistical process control

    International Nuclear Information System (INIS)

    Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.

    2015-01-01

    The Repeat Analysis Index (IRR) is one of the items in the quality control programme required by Brazilian radiological protection regulations and should be determined frequently, at least every six months. In order to extract more and better information from the IRR, this study applies statistical quality control to the reject rate through statistical process control (an attribute control chart, p chart - GC) and the Pareto chart (GP). Data were collected over 9 months, with daily collection during the last four months. Control limits (LC) were established, and Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analyzed. Relevant information, such as service orders for the X-ray equipment and film processors, was cross-referenced to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The GC demonstrated the ability to anticipate equipment failures, and the GP showed clearly which causes recur in the IRR. (authors) [pt

  18. Use of statistic control of the process as part of a quality assurance plan; Empleo del control estadistico de proceso como parte de un plan de aseguramiento de la calidad

    Energy Technology Data Exchange (ETDEWEB)

    Acosta, S.; Lewis, C., E-mail: sacosta@am.gob.ar [Autoridad Regulatoria Nuclear (ARN), Buenos Aires (Argentina)

    2013-07-01

    One of the technical requirements of the IRAM ISO 17025 standard for the accreditation of testing laboratories is the assurance of the quality of results through the control and monitoring of the factors that influence their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory for environmental strontium-90 measurements, which is in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified by means of a statistical process control chart. The scope of the present work is the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to take into account significant process variables, such as temperature and humidity, and to build a blank control chart, which forms the basis of statistical process control. The data obtained yielded the lower and upper limits for preparing the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan.

  19. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  20. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.

  1. Quality control with R an ISO standards approach

    CERN Document Server

    Cano, Emilio L; Prieto Corcoba, Mariano

    2015-01-01

    Presenting a practitioner's guide to capabilities and best practices of quality control systems using the R programming language, this volume emphasizes accessibility and ease-of-use through detailed explanations of R code as well as standard statistical methodologies. In the interest of reaching the widest possible audience of quality-control professionals and statisticians, examples throughout are structured to simplify complex equations and data structures, and to demonstrate their applications to quality control processes, such as ISO standards. The volume balances its treatment of key aspects of quality control, statistics, and programming in R, making the text accessible to beginners and expert quality control professionals alike. Several appendices serve as useful references for ISO standards and common tasks performed while applying quality control with R.

  2. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    Science.gov (United States)

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities were implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  3. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities, such as the Validation Guidance for Industry (2011), the International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and the International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, the determination of the normal probability distribution, and the assessment of the statistical stability and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes an assessment of the process at the sigma level with respect to the out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles for the achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process, in which the quality control parameters act as quality assessment parameters. Application of risk assessment allows the selection of critical quality attributes from among the quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical

  4. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. Applications include the prediction of pollution transport from continuous area sources or from one-time releases of toxic materials, and wind-energy prospecting in areas of topographic enhancement of the wind. Quality control techniques that can be applied to these data to determine whether the instruments are operating within their prescribed tolerances were investigated. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability and power spectra. The comparative techniques include covariance, cross-spectral analysis and two-dimensional probability density. At present the calculating and plotting routines for these statistical techniques do not reside in a single code, so it is difficult to ascribe memory size and computation time to them accurately. However, given the flexibility of a data system which includes simple, fast-running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF), a proper balance will be attained. These techniques are described in detail and preliminary results are presented
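
    A minimal sketch of the kinds of independent and comparative statistics listed in this record, applied to a synthetic wind-speed series; the data, sampling interval and the second "tower" are assumptions for illustration only.

```python
import numpy as np
from scipy import stats, signal

# Hypothetical 10-minute wind-speed averages from one anemometer (m/s)
rng = np.random.default_rng(1)
wind = 5.0 + rng.gamma(shape=2.0, scale=1.5, size=1024)

# Independent statistics of the kind listed in the abstract
summary = {
    "mean": wind.mean(),
    "std": wind.std(ddof=1),
    "skewness": stats.skew(wind),
    "kurtosis": stats.kurtosis(wind),            # excess kurtosis
}
freqs, psd = signal.welch(wind, fs=1 / 600.0)    # power spectrum (one sample per 10 min)

# A comparative statistic: correlation with a hypothetical neighbouring tower
wind_b = wind + rng.normal(0.0, 0.8, size=wind.size)
summary["correlation_with_tower_B"] = np.corrcoef(wind, wind_b)[0, 1]
print(summary)
```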

  5. Development of statistical and analytical techniques for use in national quality control schemes for steroid hormones

    International Nuclear Information System (INIS)

    Wilson, D.W.; Gaskell, S.J.; Fahmy, D.R.; Joyce, B.G.; Groom, G.V.; Griffiths, K.; Kemp, K.W.; Nix, A.B.J.; Rowlands, R.J.

    1979-01-01

    Adopting the rationale that the improvement of intra-laboratory performance of immunometric assays will enable the assessment of national QC schemes to become more meaningful, the group of participating laboratories has developed statistical and analytical techniques for the improvement of accuracy, precision and monitoring of error for the determination of steroid hormones. These developments are now described and their relevance to NQC schemes discussed. Attention has been focussed on some of the factors necessary for improving standards of quality in immunometric assays and their relevance to laboratories participating in NQC schemes as described. These have included the 'accuracy', precision and robustness of assay procedures as well as improved methods for internal quality control. (Auth.)

  6. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  7. Statistical method for quality control in presence of measurement errors

    International Nuclear Information System (INIS)

    Lauer-Peccoud, M.R.

    1998-01-01

    In a quality inspection of a set of items where the measurements of a quality characteristic of each item are contaminated by random errors, wrong decisions that are damaging to quality can be taken. It is therefore important to control the risks in such a way that a final quality level is ensured. An item is considered defective or not according to whether the value G of its quality characteristic is larger or smaller than a given level g. We assume that, owing to the lack of precision of the measurement instrument, the measurement M of this characteristic is expressed as M = f(G) + ξ, where f is an increasing function such that the value f(g0) is known and ξ is a random error with mean zero and given variance. First we study the problem of determining a critical measure m such that a specified quality target is reached after the classification of a lot of items in which each item is accepted or rejected depending on whether its measurement is smaller or greater than m. Then we analyse the problem of testing the global quality of a lot from the measurements of a sample of items taken from the lot. For these two kinds of problems and for different quality targets, we propose solutions, emphasizing the case where the function f is linear and the error ξ and the variable G are Gaussian. Simulation results allow the efficiency of the various control procedures considered to be assessed, together with their robustness with respect to deviations from the assumptions used in the theoretical derivations. (author)
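
    The following sketch is one possible numerical illustration of the Gaussian, linear-f case described in the record: it searches for a critical measure m that keeps the defective fraction among accepted items below a target. All numbers and the target value are assumptions, not results from the paper.

```python
import numpy as np
from scipy import stats, optimize

# Illustrative numbers only: linear f(G) = a*G + b, Gaussian G and Gaussian error xi.
a, b = 1.0, 0.0                     # measurement model M = a*G + b + xi
mu_G, sd_G = 10.0, 1.0              # distribution of the true characteristic G
sd_xi = 0.5                         # measurement-error standard deviation
g0 = 11.5                           # item is defective when G > g0
target = 0.01                       # required defective fraction among accepted items

def outgoing_defective_rate(m):
    """P(G > g0 | M <= m) for jointly Gaussian (G, M)."""
    mu_M = a * mu_G + b
    sd_M = np.hypot(a * sd_G, sd_xi)
    rho = a * sd_G / sd_M                        # corr(G, M)
    cov = [[sd_G**2, rho * sd_G * sd_M], [rho * sd_G * sd_M, sd_M**2]]
    joint = stats.multivariate_normal(mean=[mu_G, mu_M], cov=cov)
    p_accept = stats.norm.cdf(m, loc=mu_M, scale=sd_M)
    p_good_and_accept = joint.cdf([g0, m])       # P(G <= g0, M <= m)
    return 1.0 - p_good_and_accept / p_accept

# Find the critical measure m that meets the target outgoing quality
m_crit = optimize.brentq(lambda m: outgoing_defective_rate(m) - target, 8.0, 14.0)
print(f"critical measure m ~ {m_crit:.3f}")
```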

  8. Automation in Siemens fuel manufacturing - the basis for quality improvement by statistical process control (SPC)

    International Nuclear Information System (INIS)

    Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.

    1999-01-01

    Statistical Process Control (SPC) is one of the systematic tools that makes a valuable contribution to the control and planning activities for manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a programme to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on the realization of SPC in 3 pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC, and subsequent modifications were made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser welding of spacers' are located at the different sites of ANF. Owing to the completion of the first steps and the experience obtained in the pilot projects, the introduction programme for SPC has already been extended to other manufacturing processes. (authors)

  9. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    International Nuclear Information System (INIS)

    Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.

    2014-01-01

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves

  10. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    Science.gov (United States)

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the
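
    As a companion to the individuals control charts described in this record, the sketch below derives control limits for one leaf from the average moving range and flags points outside either the control limits or a specification. The leaf offsets are hypothetical; only the ±0.5 mm specification echoes one of the limits mentioned in the record.

```python
import numpy as np

def individuals_chart(x):
    """Individuals (X) chart limits from the average moving range.

    Uses the standard conversion sigma_hat = MRbar / d2, with d2 = 1.128
    for moving ranges of span 2.
    """
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))
    sigma_hat = mr.mean() / 1.128
    centre = x.mean()
    return centre, centre - 3 * sigma_hat, centre + 3 * sigma_hat

# Hypothetical leaf-position offsets (mm) for one leaf over successive QC tests
offsets = [0.05, 0.02, -0.03, 0.08, 0.01, -0.06, 0.04, 0.55, 0.03, -0.02]
centre, lcl, ucl = individuals_chart(offsets)
spec_lo, spec_hi = -0.5, 0.5                     # one specification set from the record
flags = [(i, v) for i, v in enumerate(offsets)
         if not (lcl <= v <= ucl) or not (spec_lo <= v <= spec_hi)]
print(f"centre={centre:.3f} mm, control limits=({lcl:.3f}, {ucl:.3f}) mm, flagged={flags}")
```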

  11. Statistical Data Mining for Efficient Quality Control in Manufacturing

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben Steen

    2015-01-01

    of the process e.g sensor measurements, machine readings etc, and the major contributor of these big data sets are different quality control processes. In this article we will present methodology to extract valuable insight from manufacturing data. The proposed methodology is based on comparison of probabilities...

  12. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    Science.gov (United States)

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  13. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    International Nuclear Information System (INIS)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-01-01

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety
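
    A minimal sketch of the classical process capability indices used in records 12 and 13, evaluated against a ±2% tolerance; the daily D/MU deviations below are simulated, not measured data from the study.

```python
import numpy as np

def capability(x, lsl, usl):
    """Classical short-term capability indices Cp and Cpk."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical daily D/MU deviations (%) checked against the ±2% tolerance
deviations = np.random.default_rng(2).normal(loc=0.1, scale=0.45, size=60)
cp, cpk = capability(deviations, lsl=-2.0, usl=2.0)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}  (>1.33 is a common benchmark for a capable process)")
```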

  14. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods for the statistical evaluation of quality - SPC (item 20 of the documentation system of quality control of the ISO 9000 series of norms) for various processes, products and services belong among the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, on that basis, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the following principles are presented in the contribution: cause-and-effect diagnostics, Pareto analysis and the Lorenz curve, number distributions and frequency curves of random variable distributions, and Shewhart control charts.
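
    A small sketch of the Pareto analysis mentioned in this record: tally defect causes, sort them, and report cumulative percentages to expose the "vital few". The categories reuse the defect names from record 9 above, but the counts are hypothetical.

```python
from collections import Counter

# Hypothetical tally of defect causes from an inspection log
defects = Counter({"blisters": 120, "double seam": 85, "stone": 40,
                   "pressure failure": 25, "overweight": 15, "other": 10})

total = sum(defects.values())
cumulative = 0.0
print(f"{'cause':<18}{'count':>6}{'cum %':>8}")
for cause, count in defects.most_common():
    cumulative += count
    print(f"{cause:<18}{count:>6}{100 * cumulative / total:>7.1f}%")
# The 'vital few' are the causes accounting for roughly the first 80% of defects.
```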

  15. A method for evaluating treatment quality using in vivo EPID dosimetry and statistical process control in radiation therapy.

    Science.gov (United States)

    Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H

    2017-03-13

    Purpose Due to their increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes, which to date generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for evaluating the QA of individual patient treatments and for identifying a "quality gap" for continuous quality improvement. Design/methodology/approach Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance, based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality in three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments showed process capability indices of less than 1, indicating potential for quality improvement; these sites may therefore benefit from further assessment. Research limitations/implications The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre. The sample of patients used to generate the control limits was limited to 100. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the
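
    As a complement to the moving range control chart named in this record, the sketch below computes moving ranges of per-fraction pass-rates and flags unusually large jumps using the standard D4 = 3.267 limit for span-2 ranges. The pass-rate values are hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical chi pass-rates (%) from in vivo EPID dosimetry, one value per fraction
pass_rates = np.array([93.1, 94.5, 92.8, 95.0, 93.7, 75.2, 94.1, 93.3, 92.6, 94.8])

mr = np.abs(np.diff(pass_rates))          # moving ranges of span 2
mr_bar = mr.mean()
ucl_mr = 3.267 * mr_bar                   # D4 constant for ranges of two observations
flagged_fractions = np.where(mr > ucl_mr)[0] + 1   # fractions bordering a large jump
print(f"MR-bar={mr_bar:.2f}, UCL={ucl_mr:.2f}, fractions flagged: {flagged_fractions}")
```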

  16. Experience in statistical quality control for road construction in South Africa

    CSIR Research Space (South Africa)

    Mitchell, MF

    1977-06-01

    Full Text Available of statistically oriented acceptance control procedures to a major road construction project is examined and it is concluded that such procedures promise to be of benefit to both the client and the contractor....

  17. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    Science.gov (United States)

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  18. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    a large number of nonconforming products are manufactured. ... size, n, sampling interval, h, and control limit parameter, k, that minimize the ...... [11] Montgomery DC, 2001, Introduction to statistical quality control, 4th Edition, John Wiley, New.

  19. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Control by quality: proposition of a typology.

    Science.gov (United States)

    Pujo, P; Pillet, M

    The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors draw a parallel between production control and the quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with Statistical Process Control, the Taguchi technique, and the "six sigma" approach. At the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization, through procedures, of the decision rules governing process control enhances the validity of these rules, improving their reliability and consolidating them. All this counterbalances the intrinsically fluctuating human behavior of the control

  1. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory GMP requirement. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against their quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart, and it neglects the interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated by both SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA) of the data, either autoscaled or robust scaled, showed four and seven batches, respectively, outside the Hotelling T2 95% ellipse. An improvement of the capability of the process is also observed when the most extreme batches are excluded. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
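
    A minimal sketch of the PCA-based Hotelling T2 monitoring this record describes, using one common F-distribution form of the 95% limit; the batch data are simulated and the dimensions simply mirror the 164 batches by 6 ingredients mentioned in the abstract.

```python
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2_pca(X, n_components=2, alpha=0.05):
    """Hotelling T^2 on PCA scores with a commonly used F-based control limit."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)     # autoscaling
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    lam = (s[:n_components] ** 2) / (n - 1)               # variance of each score
    t2 = np.sum(scores**2 / lam, axis=1)
    a = n_components
    limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(1 - alpha, a, n - a)
    return t2, limit

# Hypothetical assay results: 164 batches x 6 active ingredients (% of label claim)
rng = np.random.default_rng(3)
X = rng.normal(loc=100.0, scale=1.5, size=(164, 6))
t2, limit = hotelling_t2_pca(X, n_components=2)
print("batches outside the 95% T^2 limit:", np.where(t2 > limit)[0])
```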

  2. Quality control of gamma radiation measuring systems

    International Nuclear Information System (INIS)

    Surma, M.J.

    2002-01-01

    The problem of quality control and assurance for gamma radiation measuring systems is described in detail. The factors that determine the high quality of radiometric measurements, as well as the statistical testing and calibration of measuring systems, are presented and discussed

  3. Statistical process control analysis for patient quality assurance of intensity modulated radiation therapy

    Science.gov (United States)

    Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun

    2017-11-01

    This study applied statistical process control to set and verify quality assurance (QA) tolerance standards suited to our hospital's characteristics, with the criteria applied to all treatment sites included in the analysis. The gamma criterion for delivery quality assurance (DQA) was 3%/3 mm. Head and neck, breast and prostate cases treated with intensity-modulated radiation therapy (IMRT) or volumetric-modulated arc therapy (VMAT) were selected for the analysis of the QA treatment sites. The numbers of data sets used in the analysis were 73 and 68 for head and neck patients; for prostate and breast they were 49 and 152, measured with MapCHECK and ArcCHECK, respectively. The Cp values of the head and neck and prostate QA were above 1.0, and the Cpml values were 1.53 and 1.71, respectively, which is close to the target value of 100%. The Cpml value of the breast IMRT QA was 1.67, with data values close to the target value of 95%, but the Cp value was 0.90, which means that the data values are widely distributed. The Cp and Cpml of the breast VMAT QA were 1.07 and 2.10, respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.

  4. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    Science.gov (United States)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products entering the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the assembly quality of multi-element mass-produced products on automatic rotor lines. A feature of continuous sampling control of the completeness of multi-element products in the assembly process is its destructive character, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of the completeness of multi-element products assembled on automatic rotor lines requires sampling plans that ensure a minimum size of the control samples. Comparison of the limit values of the average outgoing defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that lower limit values of the average outgoing defect level can be provided using the ACSP-1. Also, the average sample size when using the ACSP-1 plan is smaller than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the proposed plans and methods for continuous sampling control, will make it possible to automate sampling control procedures and to ensure the required quality level of assembled products while minimizing the sample size.

  5. A Statistical Project Control Tool for Engineering Managers

    Science.gov (United States)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods have limitations, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of 3 successful projects and 3 failed projects, are reviewed, with success and failure being defined by the owner.

  6. Blind image quality assessment based on aesthetic and statistical quality-aware features

    Science.gov (United States)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between objective scores of these methods with human perceptual scores is considered as their performance metric. Human judgment of the image quality implicitly includes many factors when assessing perceptual image qualities such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve blind image quality assessment (BIQA) methods accuracy. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetics image features with the features of natural image statistics derived from multiple domains. The proposed features have been used for augmenting five different state-of-the-art BIQA methods, which use statistical natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement of the accuracy of the methods.

  7. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  8. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique adaptive statistical iterative reconstruction-V (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up computed tomography scans with both ASIR and ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists, and both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower with ASIR-V. Adaptive statistical iterative reconstruction-V had a higher performance for subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  9. Production process and quality control for the HTTR fuel

    International Nuclear Information System (INIS)

    Yoshimuta, S.; Suzuki, N.; Kaneko, M.; Fukuda, K.

    1991-01-01

    Development of the production and inspection technology for High Temperature Engineering Test Reactor (HTTR) fuel has been carried out through cooperative work between the Japan Atomic Energy Research Institute (JAERI) and Nuclear Fuel Industries, Ltd (NFI). The performance and the quality level of the developed fuel are well established and meet the design requirements of the HTTR. For commercial-scale production of the fuel, statistical quality control and quality assurance must be carefully considered in order to assure the safety of the HTTR. It is also important to produce the fuel under well-controlled process conditions. To meet these requirements in the production of the HTTR fuel, a new production process and quality control system is to be introduced in the new facilities. The main feature of the system is a computer-integrated control system. Process control data at each production stage of products and semi-products are all gathered by terminal computers and processed by a host computer. The processed information is used effectively for production, quality and accountancy control. With the aid of this system, all products will be easily traceable from the starting materials to the final stages, and the statistical evaluation of product quality becomes more reliable. (author). 8 figs

  10. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Science.gov (United States)

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  11. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Science.gov (United States)

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.

  12. Real-time statistical quality control and ARM

    International Nuclear Information System (INIS)

    Blough, D.K.

    1992-05-01

    An important component of the Atmospheric Radiation Measurement (ARM) Program is real-time quality control of data obtained from meteorological instruments. It is the goal of the ARM program to enhance the predictive capabilities of global circulation models by incorporating in them more detailed information on the radiative characteristics of the earth's atmosphere. To this end, a number of Cloud and Radiation Testbeds (CART's) will be built at various locations worldwide. Each CART will consist of an array of instruments designed to collect radiative data. The large amount of data obtained from these instruments necessitates real-time processing in order to flag outliers and possible instrument malfunction. The Bayesian dynamic linear model (DLM) proves to be an effective way of monitoring the time series data which each instrument generates. It provides a flexible yet powerful approach to detecting in real-time sudden shifts in a non-stationary multivariate time series. An application of these techniques to data arising from a remote sensing instrument to be used in the CART is provided. Using real data from a wind profiler, the ability of the DLM to detect outliers is studied. 5 refs
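    As a rough illustration of the kind of monitoring described above, the sketch below implements a minimal local-level dynamic linear model (a Kalman filter on a random-walk mean) and flags observations whose standardized one-step-ahead forecast error is large. The variances, the threshold, and the wind-speed values are assumptions; the actual ARM implementation is Bayesian and multivariate.

```python
import numpy as np

def dlm_outliers(y, obs_var=1.0, state_var=0.1, threshold=3.0):
    """Local-level dynamic linear model (random walk plus noise).
    Flags observations whose standardized one-step-ahead forecast
    error exceeds `threshold` forecast standard deviations."""
    m, C = y[0], obs_var            # initial state mean and variance
    flags = []
    for t, obs in enumerate(y[1:], start=1):
        R = C + state_var           # prior variance of the state
        Q = R + obs_var             # one-step-ahead forecast variance
        e = obs - m                 # forecast error
        if abs(e) / np.sqrt(Q) > threshold:
            flags.append(t)
        K = R / Q                   # Kalman gain
        m, C = m + K * e, (1 - K) * R
    return flags

# Hypothetical wind-speed series with a sudden spike at index 6.
series = np.array([5.1, 5.3, 5.2, 5.4, 5.3, 5.2, 11.0, 5.4, 5.3])
print("flagged indices:", dlm_outliers(series))
```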

  13. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.)

    International Nuclear Information System (INIS)

    Gerard, K.

    2008-11-01

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered a technique of choice to increase the local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of treatment preparation, two axes of work were defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm in the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with cancer of the oropharynx showed that the quality of the plan is also increased compared to manual beam selection. Indeed, for an equivalent or even better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results performed using an ionisation chamber at the Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment security using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)
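    The second axis above rests on a capability assessment of the dose delivery process against clinical tolerances. A minimal sketch of such a calculation is given below; the ±4% tolerance and the per-beam deviations are assumptions for illustration, not the thesis' data.

```python
import numpy as np

def capability(deviations, lower=-4.0, upper=4.0):
    """Cp and Cpk for measured-vs-calculated dose deviations (%),
    against symmetric clinical tolerance limits."""
    mu, sd = np.mean(deviations), np.std(deviations, ddof=1)
    cp = (upper - lower) / (6 * sd)
    cpk = min(upper - mu, mu - lower) / (3 * sd)
    return cp, cpk

# Hypothetical per-beam dose deviations (%) from pretreatment QC.
dev = np.array([0.5, -0.8, 1.1, 0.2, -0.4, 0.9, -1.2, 0.3, 0.0, 0.7])
cp, cpk = capability(dev)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}  (>= 1.33 is often read as 'capable')")
```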

  14. [Statistical approach to evaluate the occurrence of out-of acceptable ranges and accuracy for antimicrobial susceptibility tests in inter-laboratory quality control program].

    Science.gov (United States)

    Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa

    2013-03-01

    To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and inhibitory zone diameter for disk diffusion tests on the basis of Clinical and Laboratory Standards Institute (CLSI) M100-S21. In the analysis, more than two out-of-acceptable-range results in the 20 tests were considered not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more occurrences of out-of-acceptable-range results. Then, a binomial test was applied to each participating laboratory. The results indicated that the occurrences of out-of-acceptable-range results in the 11 laboratories were significantly higher when compared to the CLSI recommendation (allowable rate laboratory was statistically compared with zero using a Student's t-test. The results revealed that 5 of the 11 above laboratories reported erroneous test results that systematically drifted to the side of resistance. In conclusion, our statistical approach has enabled us to detect significantly higher occurrences and the sources of interpretive errors in antimicrobial susceptibility tests; therefore, this approach can provide us with additional information that can improve the accuracy of the test results in clinical microbiology laboratories.
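    The per-laboratory check described above — whether the observed number of out-of-acceptable-range results in 20 tests is compatible with an allowable rate — is a one-sided binomial test. A minimal sketch follows; the 5% allowable rate and the counts are assumptions, and scipy.stats.binomtest requires SciPy ≥ 1.7 (older versions expose binom_test instead).

```python
from scipy.stats import binomtest

# Hypothetical laboratory: 4 out-of-acceptable-range results in 20 tests.
out_of_range, n_tests = 4, 20
allowable_rate = 0.05   # assumed allowable error rate, for illustration only

result = binomtest(out_of_range, n_tests, allowable_rate, alternative="greater")
print(f"p = {result.pvalue:.4f}")  # small p: occurrences exceed the allowable rate
```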

  15. Pattern-based feature extraction for fault detection in quality relevant process control

    NARCIS (Netherlands)

    Peruzzo, S.; Holenderski, M.J.; Lukkien, J.J.

    2017-01-01

    Statistical quality control (SQC) applies multivariate statistics to monitor production processes over time and detect changes in their performance in terms of meeting specification limits on key product quality metrics. These limits are imposed by customers and typically assumed to be a single

  16. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. Control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. Confidence interval method is used as the basis for adaptation. A simple statistical model which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept.
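    A rough sketch of the count-up-count-down logic mentioned above is given below: the ignition correction is increased when the average maximal knock amplitude exceeds the threshold and decreased otherwise. The step sizes, window length, threshold, and signal model are assumptions, not the paper's calibrated values.

```python
import numpy as np

def knock_control(amplitudes, threshold, step_up=0.02, step_down=0.01):
    """Simple count-up/count-down logic: increase the ignition correction
    (retard) when the average maximal knock amplitude in a window exceeds
    the threshold, decrease it (advance) otherwise. Returns the trajectory
    of the correction."""
    correction, trajectory = 0.0, []
    for window in amplitudes:                 # one window of cycles per step
        if np.mean(window) > threshold:
            correction += step_up             # knock detected: count up
        else:
            correction -= step_down           # no knock: count down slowly
        trajectory.append(round(correction, 3))
    return trajectory

# Hypothetical per-window maximal knock-sensor amplitudes (knock after window 5).
windows = [np.random.default_rng(i).normal(1.0 + 0.3 * (i > 5), 0.05, 8)
           for i in range(10)]
print(knock_control(windows, threshold=1.15))
```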

  17. An Improvement of the Hotelling T2 Statistic in Monitoring Multivariate Quality Characteristics

    Directory of Open Access Journals (Sweden)

    Ashkan Shabbak

    2012-01-01

    Full Text Available The Hotelling T2 statistic is the most popular statistic used in multivariate control charts to monitor multiple qualities. However, this statistic is easily affected by the existence of more than one outlier in the data set. To rectify this problem, robust control charts, which are based on the minimum volume ellipsoid and the minimum covariance determinant, have been proposed. Most researchers assess the performance of multivariate control charts based on the number of signals without paying much attention to whether those signals are really outliers. With due respect, we propose to evaluate control charts not only based on the number of detected outliers but also with respect to their correct positions. In this paper, an Upper Control Limit based on the median and the median absolute deviation is also proposed. The results of this study signify that the proposed Upper Control Limit improves the detection of correct outliers but that it suffers from a swamping effect when the positions of outliers are not taken into consideration. Finally, a robust control chart based on the diagnostic robust generalised potential procedure is introduced to remedy this drawback.
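    For reference, the sketch below computes per-observation Hotelling T2 values, a classical F-based upper control limit, and an illustrative robust limit built from the median and the MAD of the T2 values. The exact form of the robust UCL proposed in the paper may differ; the constant k and the simulated data are assumptions.

```python
import numpy as np
from scipy.stats import f

def hotelling_t2(X):
    """Per-observation Hotelling T2 statistics for a phase-I sample."""
    mean = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mean
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

def classical_ucl(n, p, alpha=0.0027):
    """F-distribution based UCL for individual observations (phase-II form)."""
    return p * (n + 1) * (n - 1) / (n * (n - p)) * f.ppf(1 - alpha, p, n - p)

def robust_ucl(t2, k=3.0):
    """Illustrative robust limit: median + k * scaled MAD of the T2 values."""
    med = np.median(t2)
    mad = 1.4826 * np.median(np.abs(t2 - med))
    return med + k * mad

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=50)
X[10] = [4.5, -3.0]                      # planted outlier
t2 = hotelling_t2(X)
print("classical UCL:", round(classical_ucl(len(X), X.shape[1]), 2),
      "robust UCL:", round(robust_ucl(t2), 2),
      "flagged:", np.where(t2 > robust_ucl(t2))[0])
```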

  18. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Directory of Open Access Journals (Sweden)

    Stephan Birle

    2016-01-01

    Full Text Available In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define the control performance as still acceptable may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs, control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that alters the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the

  19. Computer-aided control of high-quality cast iron

    Directory of Open Access Journals (Sweden)

    S. Pietrowski

    2008-04-01

    Full Text Available The study discusses the possibility of control of the high-quality grey cast iron and ductile iron using the author’s genuine computer programs. The programs have been developed with the help of algorithms based on statistical relationships that are said to exist between the characteristic parameters of DTA curves and properties, like Rp0,2, Rm, A5 and HB. It has been proved that the spheroidisation and inoculation treatment of cast iron changes in an important way the characteristic parameters of DTA curves, thus enabling a control of these operations as regards their correctness and effectiveness, along with the related changes in microstructure and mechanical properties of cast iron. Moreover, some examples of statistical relationships existing between the typical properties of ductile iron and its control process were given for cases of the melts consistent and inconsistent with the adopted technology. A test stand for control of the high-quality cast iron and respective melts has been schematically depicted.

  20. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive amount of cross correlation, practitioners are often recommended to use latent structures methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts
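    A minimal sketch of the latent-structure route described above: fit principal components on in-control data and monitor a T2-type statistic on the retained scores. The number of components, the simulated process, and the injected shift are assumptions.

```python
import numpy as np

def pca_t2(X_train, X_new, n_components=2):
    """Fit PCA on in-control data and return a T2 statistic for new
    observations computed on the retained principal-component scores."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                              # loadings
    lam = (s[:n_components] ** 2) / (len(X_train) - 1)   # score variances
    scores = (X_new - mu) @ P
    return np.sum(scores ** 2 / lam, axis=1)

rng = np.random.default_rng(1)
train = rng.normal(size=(200, 10)) @ rng.normal(size=(10, 10))  # correlated 10-D process
new = train[:5].copy()
new[2] += 8                                              # simulated process shift
print(np.round(pca_t2(train, new), 2))                   # the shifted row stands out
```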

  1. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  2. Statistical quality control charts for liver transplant process indicators: evaluation of a single-center experience.

    Science.gov (United States)

    Varona, M A; Soriano, A; Aguirre-Jaime, A; Barrera, M A; Medina, M L; Bañon, N; Mendez, S; Lopez, E; Portero, J; Dominguez, D; Gonzalez, A

    2012-01-01

    Liver transplantation, the best option for many end-stage liver diseases, is indicated for more candidates than donor availability can supply. In this situation, this demanding treatment must achieve excellence, accessibility, and patient satisfaction to be ethical, scientific, and efficient. The current consensus of quality measurements promoted by the Sociedad Española de Trasplante Hepático (SETH) seeks to define criteria, indicators, and standards for liver transplantation in Spain. Following this recommendation, the Canary Islands liver program has studied its experience. We separated the 411 cadaveric transplants performed in the last 15 years into 2 groups: the first 100 and the subsequent 311. The 8 criteria of SETH 2010 were correctly fulfilled. For most indicators, the outcomes were favorable, with actuarial survival at 1, 3, 5, and 10 years of 84%, 79%, 76%, and 65%, respectively; excellent results in retransplant rates (early 0.56% and long-term 5.9%), primary nonfunction rate (0.43%), waiting list mortality (13.34%), and patient satisfaction (91.5%). On the other hand, some indicators were worse, such as perioperative, postoperative, and early mortality with normal graft function, and the reoperation rate. After analysis of the series with statistical quality control charts, we observed an improvement in all indicators, even in the apparently worst one, early mortality with normal graft function, in a stable program. Such results helped us to identify specific areas in which to improve the program. The application of quality measurement, as the SETH consensus recommends, has shown in our study that despite being a time-consuming process, it is a useful tool. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Pengendalian Kualitas Produk Di Industri Garment Dengan Menggunakan Statistical Procces Control (SPC

    Directory of Open Access Journals (Sweden)

    Rizal Rachman

    2017-09-01

    Full Text Available Abstract: The company views quality as a key factor that brings success and as the quality standard set by the buyer. The purpose of this study was to determine the level of product defects within quality control limits in the garment production process at PT. Asia Penta Garment. This study uses the statistical process control method. The data used in this study were secondary data in the form of reports on production volume and defects of finished garments in the finishing section in January 2017. The results show defects beyond the control limits, that is, points out of control with respect to the upper control limit (UCL) and the lower control limit (LCL), and an average defect rate outside the control limits. To improve product quality, particularly of the garments produced by the company, the established quality policy must be implemented properly, including negotiating raw materials with the buyer according to standards, recruiting experienced workers, maintaining high work discipline, coaching employees, giving bonuses to employees who meet targets and show high discipline, continuously repairing machines, and maintaining a clean, comfortable and safe working environment.   Keywords: Quality control, Product quality, SPC. Abstract The Company considers quality as a key factor that brings success, together with the quality standards set by the buyer. The purpose of this study was to determine the level of product defects within the limits of quality control in the garment production process at PT. Asia Penta Garment. This study uses a statistical process control method. The data taken in this study are secondary data from reports on the production volume and damage to clothing in the finishing section in January 2017. Based on the results, the damage is outside the control limits, that is, there are points beyond the control limits (out of control) with respect to the upper control limit

  4. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess the electron beam monitoring statistical process control (SPC) in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected, they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
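    Daily output monitoring of this kind is commonly summarized with individuals/moving-range (I-MR) limits and a capability ratio against the specification; the sketch below shows one such calculation. The ±2% specification, the readings, and the use of the standard chart constants here are illustrative assumptions, not the paper's data.

```python
import numpy as np

def imr_limits(x):
    """Individuals and moving-range chart limits (standard constants
    d2 = 1.128 and D4 = 3.267 for a moving range of two observations)."""
    mr = np.abs(np.diff(x))
    mr_bar = mr.mean()
    center = x.mean()
    sigma_hat = mr_bar / 1.128
    return (center - 3 * sigma_hat, center + 3 * sigma_hat), (0.0, 3.267 * mr_bar)

def capability_ratio(x, lsl, usl):
    """Cp-style capability ratio using the moving-range estimate of sigma."""
    sigma_hat = np.abs(np.diff(x)).mean() / 1.128
    return (usl - lsl) / (6 * sigma_hat)

# Hypothetical daily beam-output readings (% of nominal dose), spec 98-102%.
output = np.array([100.1, 99.8, 100.3, 100.0, 99.9, 100.2, 100.4, 99.7, 100.1])
print(imr_limits(output), "Cp =", round(capability_ratio(output, 98.0, 102.0), 2))
```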

  5. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitati

  6. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  7. Implementation of quality control systematics for personnel monitoring services

    International Nuclear Information System (INIS)

    Franco, J.O.A.

    1984-01-01

    The implementation of statistical quality control techniques used in industrial practice is proposed for dosimetric services. 'Control charts' and 'sampling inspection' are adapted, respectively, for control of the measuring process and of the dose results produced in routine work. A chapter on Radiation Protection and Personnel Monitoring is included. (M.A.C.) [pt

  8. Batch-to-batch quality consistency evaluation of botanical drug products using multivariate statistical analysis of the chromatographic fingerprint.

    Science.gov (United States)

    Xiong, Haoshu; Yu, Lawrence X; Qu, Haibin

    2013-06-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has shown its efficacy and applicability in the quality evaluation of many kinds of industrial products. In this paper, the combined use of multivariate statistical analysis and chromatographic fingerprinting is presented to evaluate batch-to-batch quality consistency of botanical drug products. A typical botanical drug product in China, Shenmai injection, was selected as the example to demonstrate the feasibility of this approach. The high-performance liquid chromatographic fingerprint data of historical batches were collected from a traditional Chinese medicine manufacturing factory. Characteristic peaks were weighted by their variability among production batches. A principal component analysis model was established after outliers were modified or removed. Multivariate (Hotelling T(2) and DModX) control charts were finally successfully applied to evaluate the quality consistency. The results suggest useful applications for a combination of multivariate statistical analysis with chromatographic fingerprinting in batch-to-batch quality consistency evaluation for the manufacture of botanical drug products.

  9. Initiating statistical process control to improve quality outcomes in colorectal surgery.

    Science.gov (United States)

    Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P

    2015-12-01

    Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility to improve outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed--83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in elective open (57.1 vs. 20.0%) and elective lap groups (77.6 vs. 26.1%). Outliers also had higher readmission rates for emergent open (11.4 vs. 5.4%), emergent lap (14.3 vs. 9.2%), and elective lap (32.8 vs. 6.9%). Elective open outliers did not follow trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification, during quality improvement efforts, and reevaluation of outcomes after introducing process change. SPC has clinical implications for improving patient outcomes and resource utilization.

  10. Internal quality control of neutron activation analysis laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. H.; Mun, J. H.; BaeK, S. Y.; Jung, Y. S.; Kim, Y. J. [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    The importance of quality assurance and quality control in analytical laboratories is being emphasized more and more. Internal quality control using certified reference materials (CRMs) can be one of the effective methods for this purpose. In this study, 10 kinds of CRMs consisting of soil, sediment and biological matrices were analyzed. To evaluate the confidence of the analytical results and to validate the testing method and procedure, the accuracy and precision of the measured elements were treated statistically and the reproducibility was compared with the values produced before 2003.

  11. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered here and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows diagnostic plots to be built based on the parameter estimates that can provide valuable insight...

  12. Statistical process control in wine industry using control cards

    OpenAIRE

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

    This paper is based on research into the technological process of automatic wine bottle filling in a winery in Stip, Republic of Macedonia. Statistical process control using control charts was applied. The results and recommendations for improving the process are discussed.

  13. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry;Maitrise statistique des processus appliquee aux controles avant traitement par dosimetrie portale en radiotherapie conformationnelle avec modulation d'intensite

    Energy Technology Data Exchange (ETDEWEB)

    Villani, N.; Noel, A. [Laboratoire de recherche en radiophysique, CRAN UMR 7039, Nancy universite-CNRS, 54 - Vandoeuvre-les-Nancy (France); Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A. [Departement de radiophysique, centre Alexis-Vautrin, 54 - Vandoeuvre-les-Nancy (France); Francois, P. [Institut Curie, 75 - Paris (France)

    2010-06-15

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for the absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect the guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation, showing both slow and weak drifts as well as strong and fast ones, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the E.P.I.D. and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to

  14. Internal quality control: planning and implementation strategies.

    Science.gov (United States)

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
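    The planning step above amounts to choosing control rules and the number of control measurements. As a toy illustration (not Westgard's full planning tools), the sketch below evaluates two common multirule criteria, 1-3s and 2-2s, on a run of control results; the analyte, target, and SD are assumptions.

```python
def westgard_flags(values, target, sd):
    """Evaluate two common control rules on a sequence of QC results:
    1-3s: any single value beyond +/-3 SD of the target;
    2-2s: two consecutive values beyond +/-2 SD on the same side."""
    z = [(v - target) / sd for v in values]
    flags = []
    if any(abs(x) > 3 for x in z):
        flags.append("1-3s")
    if any(z[i] > 2 and z[i + 1] > 2 or z[i] < -2 and z[i + 1] < -2
           for i in range(len(z) - 1)):
        flags.append("2-2s")
    return flags or ["in control"]

# Hypothetical daily glucose control results (mmol/L), target 5.0, SD 0.1.
print(westgard_flags([5.05, 5.12, 5.22, 5.23, 4.98], target=5.0, sd=0.1))
```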

  15. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Science.gov (United States)

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  16. Multivariate Statistical Analysis of Water Quality data in Indian River Lagoon, Florida

    Science.gov (United States)

    Sayemuzzaman, M.; Ye, M.

    2015-12-01

    The Indian River Lagoon, part of the longest barrier island complex in the United States, is a region of particular concern to environmental scientists because of the rapid rate of human development throughout the region and its geographical position between the colder temperate zone and the warmer sub-tropical zone. Surface water quality analysis in this region therefore continually yields new information. In the present study, multivariate statistical procedures were applied to analyze the spatial and temporal water quality in the Indian River Lagoon over the period 1998-2013. Twelve parameters were analyzed at twelve key water monitoring stations in and beside the lagoon on monthly datasets (a total of 27,648 observations). The dataset was treated using cluster analysis (CA), principal component analysis (PCA) and non-parametric trend analysis. The CA was used to cluster the twelve monitoring stations into four groups, with stations with similar surrounding characteristics being in the same group. The PCA was then applied to the resulting groups to find the important water quality parameters. The principal components (PCs), PC1 to PC5, were considered based on explained cumulative variances of 75% to 85% in each cluster group. Nutrient species (phosphorus and nitrogen), salinity, specific conductivity and erosion factors (TSS, turbidity) were the major variables involved in the construction of the PCs. Statistically significant positive or negative trends and abrupt trend shifts were detected by applying the Mann-Kendall trend test and the Sequential Mann-Kendall (SQMK) test to each individual station for the important water quality parameters. Land use and land cover change patterns, local anthropogenic activities and extreme climate events such as drought might be associated with these trends. This study presents a multivariate statistical assessment in order to obtain better information about the quality of surface water. Thus, effective pollution control/management of the surface
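    Of the methods above, the Mann-Kendall test reduces to counting concordant and discordant pairs over time. A minimal version (normal approximation, no tie correction) is sketched below; the total-nitrogen series is an assumption for illustration.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Minimal Mann-Kendall trend test (no tie correction).
    Returns the S statistic and a two-sided p-value from the
    normal approximation."""
    x = np.asarray(x)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical monthly total-nitrogen concentrations (mg/L) with an upward drift.
tn = [0.52, 0.55, 0.53, 0.58, 0.60, 0.59, 0.63, 0.66, 0.65, 0.70]
print(mann_kendall(tn))
```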

  17. Multivariate Statistical Process Control Charts and the Problem of Interpretation: A Short Overview and Some Applications in Industry

    OpenAIRE

    Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios

    2005-01-01

    Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...

  18. Quality Control in Production Processes

    Directory of Open Access Journals (Sweden)

    Prístavka Miroslav

    2016-09-01

    Full Text Available The tools for quality management are used for quality improvement throughout Europe and other developed countries. Simple statistical methods are considered among the most basic of these tools. The goal was to apply simple statistical methods in practice and to solve problems by using them. Selected methods are used for processing the list of internal discrepancies within the organization, and for identifying the root cause of a problem and its appropriate solution. The seven basic quality tools are simple graphical tools, but they are very effective in solving quality-related problems. They are called essential because they are suitable for people with at least basic knowledge of statistics; therefore, they can be used to solve the vast majority of problems.

  19. Advanced methods of quality control in nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Onoufriev, Vladimir

    2004-01-01

    Under the pressure of the current economic and electricity market situation, utilities implement more demanding fuel utilization schemes, including higher burnups and thermal rates, longer fuel cycles and usage of Mo fuel. Therefore, fuel vendors have recently initiated new R and D programmes aimed at improving fuel quality, design and materials to produce robust and reliable fuel. In the beginning of commercial fuel fabrication, emphasis was given to advancements in Quality Control/Quality Assurance related mainly to the product itself. In recent years, emphasis has shifted to improvements in process control and to implementation of overall Total Quality Management (TQM) programmes. In the area of fuel quality control, statistical control methods are now widely implemented, replacing 100% inspection. This evolution, some practical examples and IAEA activities are described in the paper. The paper presents major findings of the latest IAEA Technical Meetings (TMs) and training courses in the area, with emphasis on information received at the TM and training course held in 1999 and other recent publications, to provide an overview of new developments in process/quality control, their implementation and the results obtained, including new approaches to QC

  20. Multivariate Statistical Process Control Charts: An Overview

    OpenAIRE

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...

  1. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability
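    The bias-and-precision model described above can be illustrated with a one-sample t-test of control-standard results against the certified value (bias) and a relative standard deviation (precision). This is a simplified sketch; the analyte, certified value, and data are assumptions, not the RAL's actual model.

```python
import numpy as np
from scipy.stats import ttest_1samp

def bias_and_precision(measured, known_value):
    """Estimate relative bias (with a one-sample t-test against the
    certified value) and precision (relative standard deviation)
    from repeated analyses of a control standard."""
    measured = np.asarray(measured, dtype=float)
    bias_pct = 100 * (measured.mean() - known_value) / known_value
    rsd_pct = 100 * measured.std(ddof=1) / measured.mean()
    t_stat, p_value = ttest_1samp(measured, known_value)
    return bias_pct, rsd_pct, p_value

# Hypothetical uranium control-standard results (g/L), certified at 2.50 g/L.
results = [2.52, 2.49, 2.54, 2.51, 2.53, 2.50, 2.55]
print(bias_and_precision(results, known_value=2.50))
```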

  2. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    Science.gov (United States)

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center ({+-}4% of deviation between the calculated and measured doses) by calculating a control process capability (C{sub pc}) index. The C{sub pc} index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should
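    Of the three charts mentioned, the EWMA is the one most sensitive to slow drifts. A minimal version with the usual asymptotic control limits is sketched below; the smoothing constant λ, the target, σ, and the deviation data are assumptions for illustration.

```python
import numpy as np

def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
    """Exponentially weighted moving average chart with asymptotic
    control limits. Returns the EWMA series and out-of-control indices."""
    z, out, series = target, [], []
    limit = L * sigma * np.sqrt(lam / (2 - lam))
    for i, xi in enumerate(x):
        z = lam * xi + (1 - lam) * z
        series.append(round(z, 3))
        if abs(z - target) > limit:
            out.append(i)
    return series, out

# Hypothetical dose deviations (%) drifting upward from measurement 8 onwards.
dev = [0.2, -0.3, 0.1, 0.4, -0.1, 0.0, 0.3, 0.2, 1.1, 1.4, 1.6, 1.9]
print(ewma_chart(dev, target=0.0, sigma=0.5))
```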

  4. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the

  5. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models

  6. Water Quality attainment Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Designated uses assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality...

  7. Water Quality Stressor Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Stressors assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality assessments...

  8. Association between product quality control and process quality control of bulk milk

    NARCIS (Netherlands)

    Velthuis, A.; Asseldonk, van M.A.P.M.

    2010-01-01

    Assessment of dairy-milk quality is based on product quality control (testing bulk-milk samples) and process quality control (auditing dairy farms). It is unknown whether process control improves product quality. To quantify possible association between product control and process control a

  9. Batch-to-Batch Quality Consistency Evaluation of Botanical Drug Products Using Multivariate Statistical Analysis of the Chromatographic Fingerprint

    OpenAIRE

    Xiong, Haoshu; Yu, Lawrence X.; Qu, Haibin

    2013-01-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure the efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has showed its efficacy and applicability in the quality evaluation of many ...

  10. Quality Control Applications

    CERN Document Server

    Chorafas, Dimitris N

    2013-01-01

    Quality control is a constant priority in electrical, mechanical, aeronautical, and nuclear engineering – as well as in the vast domain of electronics, from home appliances to computers and telecommunications. Quality Control Applications provides guidance and valuable insight into quality control policies; their methods, their implementation, constant observation and associated technical audits. What has previously been a mostly mathematical topic is translated here for engineers concerned with the practical implementation of quality control. Once the fundamentals of quality control are established, Quality Control Applications goes on to develop this knowledge and explain how to apply it in the most effective way. Techniques are described and supported using relevant, real-life, case studies to provide detail and clarity for those without a mathematical background. Among the many practical examples, two case studies dramatize the importance of quality assurance: A shot-by-shot analysis of the errors made ...

  11. A statistical rationale for establishing process quality control limits using fixed sample size, for critical current verification of SSC superconducting wire

    International Nuclear Information System (INIS)

    Pollock, D.A.; Brown, G.; Capone, D.W. II; Christopherson, D.; Seuntjens, J.M.; Woltz, J.

    1992-01-01

    This work has demonstrated the statistical concepts behind the XBAR R method for determining sample limits to verify billet Ic performance and process uniformity. Using a preliminary population estimate for μ and σ from a stable production lot of only 5 billets, we have shown that reasonable sensitivity to systematic process drift and random within-billet variation may be achieved by using per-billet subgroup sizes of moderate proportions. The effects of subgroup size (n) and sampling risk (α and β) on the calculated control limits have been shown to be important factors that need to be carefully considered when selecting an actual number of measurements to be used per billet, for each supplier process. Given the present method of testing in which individual wire samples are ramped to Ic only once, with measurement uncertainty due to repeatability and reproducibility (typically > 1.4%), large subgroups (i.e. >30 per billet) appear to be unnecessary, except as an inspection tool to confirm wire process history for each spool. The introduction of the XBAR R method or a similar Statistical Quality Control procedure is recommended for use in the superconducting wire production program, particularly when the program transitions from requiring tests for all pieces of wire to sampling each production unit
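    The XBAR-R limits referred to above follow from the standard subgroup constants. The sketch below computes them for subgroups of size 5 (A2 = 0.577, D3 = 0, D4 = 2.114 from standard SPC tables); the critical-current values are hypothetical.

```python
import numpy as np

# Standard Xbar-R chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Compute Xbar and R chart control limits from rational subgroups
    (for example, one subgroup of wire Ic measurements per billet)."""
    subgroups = np.asarray(subgroups, dtype=float)
    xbar = subgroups.mean(axis=1)
    r = subgroups.max(axis=1) - subgroups.min(axis=1)
    xbarbar, rbar = xbar.mean(), r.mean()
    return {"xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
            "range": (D3 * rbar, rbar, D4 * rbar)}

# Hypothetical critical-current measurements (A), 5 wire samples per billet.
billets = [[272, 268, 275, 270, 269],
           [271, 274, 270, 272, 268],
           [269, 273, 271, 270, 272],
           [275, 270, 268, 274, 271],
           [270, 272, 269, 271, 273]]
print(xbar_r_limits(billets))
```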

  12. Metrological aspects to quality control for natural gas analyses

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Claudia Cipriano; Borges, Cleber Nogueira; Cunha, Valnei S. [Instituto Nacional de Metrologia, Normalizacao e Qualidade Industrial (INMETRO), Rio de Janeiro, RJ (Brazil); Augusto, Cristiane R. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil); Augusto, Marco Ignazio [Companhia Estadual de Gas do Rio de Janeiro (CEG), RJ (Brazil)

    2008-07-01

    The quality of products and services is a fundamental topic in globalized commercial relationships, including measurements of natural gas. With the inclusion of natural gas among Brazil's energy resources, considerable investment by industry was necessary, especially in the quality control of the commercialized gas. The Brazilian regulatory agency, ANP - Agencia Nacional de Petroleo, Gas Natural e Biocombustiveis - created Resolution ANP no. 16. This Resolution defines the specification of natural gas, of either national or international origin, for commercialization in Brazil and lists the tolerated concentrations for some components. Among these components are inert compounds such as CO{sub 2} and N{sub 2}. The presence of these compounds reduces the calorific power and, besides increasing the resistance to detonation in the case of vehicular applications, causes a reduction in the methane concentration of the gas. Control charts can be useful to verify whether or not the process is under statistical control. The process can be considered under statistical control if the measured values lie between lower and upper limits established previously. Control charts can address several characteristics in each subgroup: means, standard deviations, ranges or proportions of defects. The charts are drawn for a specific characteristic and are intended to detect deviations in the process under specific environmental conditions. The CEG - Companhia de Distribuicao de Gas do Rio de Janeiro - and the DQUIM - Chemical Metrology Division - have an agreement for technical cooperation in research and development on natural gas composition. Given the importance of natural gas for national development, as well as the issues concerning custody transfer, the objective of this work is to demonstrate the quality control of the natural gas composition between the CEG laboratory and the DQUIM laboratory aiming the quality increase of the

  13. Implementing self sustained quality control procedures in a clinical laboratory.

    Science.gov (United States)

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory: it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and results in an overall strengthening of the health care system. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time-consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for preparation of stabilized liquid quality control serum from human sera. Internal Quality Assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules for a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure which can be performed by laboratories with minimal technology, expenditure and expertise, improving the reliability and validity of the test reports.

  14. Radiation measurements and quality control

    International Nuclear Information System (INIS)

    McLaughlin, W.L.

    1977-01-01

    Accurate measurements are essential to research leading to a successful radiation process and to the commissioning of the process and the facility. On the other hand, once the process is in production, the importance to quality control of measuring radiation quantities (i.e., absorbed dose, dose rate, dose distribution) rather than various other parameters of the process (i.e. conveyor speed, dwell time, radiation field characteristics, product dimensions) is not clearly established. When the safety of the product is determined by the magnitude of the administered dose, as in radiation sterilization, waste control, or food preservation, accuracy and precision of the measurement of the effective dose are vital. Since physical dose measurements are usually simpler, more reliable and reproducible than biological testing of the product, there is a trend toward using standardized dosimetry for quality control of some processes. In many industrial products, however, such as vulcanized rubber, textiles, plastics, coatings, films, wire and cable, the effective dose can be controlled satisfactorily by controlling process variables or by product testing itself. In the measurement of radiation dose profiles by dosimetry, it is necessary to have suitable dose meter calibrations, to account for sources of error and imprecision, and to use correct statistical procedures in specifying dwell times or conveyor speeds and source and product parameters to achieve minimum and maximum doses within specifications. (author)

  15. Validation of analytical methods for the quality control of Naproxen suppositories

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Hernandez Contreras, Orestes Yuniel

    2011-01-01

    Analytical methods to be used for the quality control of the future Cuban-made Naproxen suppositories for adults and children were developed for the first time in this paper. A method based on direct ultraviolet spectrophotometry was put forward, which proved to be specific, linear, accurate and precise for the quality control of Naproxen suppositories, taking into account the presence of chromophore groups in its structure. Likewise, the direct semi-aqueous acid-base volumetric method used for the quality control of the Naproxen raw material was adapted to the quality control of the suppositories. The validation process demonstrated the adequate specificity of this method with respect to the formulation components, as well as its linearity, accuracy and precision in the 1-3 mg/ml range. The final results of the two methods were compared and no statistically significant differences among the replicates at each dose were found; therefore, both methods may be used in the quality control of Naproxen suppositories.

  16. Analytical techniques and quality control in biomedical trace element research

    DEFF Research Database (Denmark)

    Heydorn, K.

    1994-01-01

    The small number of analytical results in trace element research calls for special methods of quality control. It is shown that when the analytical methods are in statistical control, only small numbers of duplicate or replicate results are needed to ascertain the absence of systematic errors. ... Measurement compatibility is obtained by control of traceability to certified reference materials. (C) 1994 Wiley-Liss, Inc.

  17. Control of Bank Consolidated Financial Statements Quality

    Directory of Open Access Journals (Sweden)

    Margarita S. Ambarchyan

    2013-01-01

    Full Text Available The author presents a multiple linear regression model of bank consolidated financial statements quality. The article considers six characteristics that can be used to estimate the level of quality of bank consolidated financial statements. The multiple linear regression model was developed using the results of a point-based assessment of the consolidated financial statements of thirty European banking and financial groups against the developed characteristics. The author proposes using a characteristic significance factor in the point-based appraisal of consolidated financial statements. The constructed regression model is checked for accuracy and statistical significance. The model can be used by internal auditors and financial analysts as an instrument for quality control of bank and non-bank consolidated financial statements.

  18. Quality control and conduct of genome-wide association meta-analyses

    DEFF Research Database (Denmark)

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C

    2014-01-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC...

  19. Characterization of groundwater quality using water evaluation indices, multivariate statistics and geostatistics in central Bangladesh

    Directory of Open Access Journals (Sweden)

    Md. Bodrud-Doza

    2016-04-01

    Full Text Available This study investigates the groundwater quality in the Faridpur district of central Bangladesh based on 60 preselected sample points. Water evaluation indices and a number of statistical approaches, such as multivariate statistics and geostatistics, are applied to characterize the water quality with respect to its suitability for drinking purposes. The study reveals that EC, TDS, Ca2+, total As and Fe values of groundwater samples exceeded Bangladesh and international standards. The groundwater quality index (GWQI) showed that about 47% of the samples belonged to good quality water for drinking purposes. The heavy metal pollution index (HPI), degree of contamination (Cd) and heavy metal evaluation index (HEI) reveal that most of the samples belong to a low level of pollution; however, Cd provides a better alternative than the other indices. Principal component analysis (PCA) suggests that groundwater quality is mainly related to geogenic (rock-water interaction) and anthropogenic (agrogenic and domestic sewage) sources in the study area. The findings of cluster analysis (CA) and the correlation matrix (CM) are also consistent with the PCA results. The spatial distributions of the groundwater quality parameters are determined by geostatistical modeling. The exponential semivariogram model is validated as the best-fitted model for most of the index values. It is expected that the outcomes of the study will provide insights for decision makers taking proper measures for groundwater quality management in central Bangladesh.
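
    As a rough illustration of the multivariate step described above, the sketch below runs a principal component analysis on a small synthetic sample-by-parameter table; the parameter names and all values are invented for demonstration and are not the Faridpur measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for a sample-by-parameter water quality table.
params = ["EC", "TDS", "Ca", "As", "Fe", "Cl"]
n_samples = 60
X = rng.lognormal(mean=1.0, sigma=0.4, size=(n_samples, len(params)))
X[:, 1] = 0.7 * X[:, 0] + 0.3 * X[:, 1]     # make TDS track EC, as a toy structure

# Standardize so that parameters with large numeric ranges do not dominate.
X_std = StandardScaler().fit_transform(X)

pca = PCA(n_components=3)
scores = pca.fit_transform(X_std)

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
# Loadings show which parameters drive each component (the original study
# interprets such groupings as geogenic versus anthropogenic sources).
for i, comp in enumerate(pca.components_, start=1):
    top = sorted(zip(params, comp), key=lambda t: abs(t[1]), reverse=True)
    print(f"PC{i} loadings:", [(p, round(c, 2)) for p, c in top])
```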

  20. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    Science.gov (United States)

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
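
    The two quantities at the heart of such an analysis, individuals control limits and a capability index, are easy to compute once the DQA pass rates are collected. The sketch below uses invented gamma pass-rate data, a lower specification of 85% and an assumed process target; the one-sided Cpm-style index shown is a generic form, not necessarily the exact definition used in the cited study.

```python
import numpy as np

# Invented gamma pass rates (%) from consecutive patient-specific DQAs.
rng = np.random.default_rng(1)
pass_rates = np.clip(rng.normal(loc=94.0, scale=2.5, size=60), 0, 100)

# Individuals control chart limits from the average moving range.
mean = pass_rates.mean()
mr_bar = np.abs(np.diff(pass_rates)).mean()
sigma_short_term = mr_bar / 1.128          # d2 constant for a moving range of 2
ucl = mean + 3 * sigma_short_term
lcl = mean - 3 * sigma_short_term
print(f"mean = {mean:.1f}%, LCL = {lcl:.1f}%, UCL = {ucl:.1f}%")

# One-sided capability index against a lower specification limit (LSL),
# penalized for deviation from an assumed target T (Cpm-style).
LSL, target = 85.0, 95.0
tau = np.sqrt(pass_rates.var(ddof=1) + (mean - target) ** 2)
capability = (mean - LSL) / (3 * tau)
print(f"capability index vs LSL={LSL}%: {capability:.2f} (>1.33 usually read as capable)")

# Flag DQAs below the lower control limit for investigation.
print("out-of-control DQA indices:", np.where(pass_rates < lcl)[0])
```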

  1. Quality control in the histopathology laboratory: An overview with stress on the need for a structured national external quality assessment scheme

    Directory of Open Access Journals (Sweden)

    Iyengar Jayaram

    2009-01-01

    Full Text Available The concept of quality control in histopathology is relatively young and less well understood. As in other disciplines of laboratory medicine, the concept of quality and its control is applicable to pre-analytical, analytical and post-analytical activities. Assessment of both precision and accuracy performance is possible through appropriate internal and external quality control and assessment schemes. This article is a review of all the processes that achieve quality reporting in histopathology. There is a special focus on external quality assessment, a scheme that lacks organization on a national level in our country. Statistical data derived from a small-scale external quality assurance program are also analyzed, along with recommendations to organize an effective national scheme with the participation of authorized zonal centers.

  2. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.

  3. INSTITUTIONAL MANAGEMENT OF EUROPEAN STATISTICS AND OF THEIR QUALITY - CURRENT CONCERNS AT EUROPEAN LEVEL

    Directory of Open Access Journals (Sweden)

    Daniela ŞTEFĂNESCU

    2011-08-01

    Full Text Available The issues referring to official statistics quality and reliability became the main topics of debate as far as statistical governance in Europe is concerned. The Council welcomed the Commission Communication to the European Parliament and to the Council «Towards robust quality management for European Statistics» (COM 211), appreciating that the approach and the objective of the strategy would confer on the European Statistical System (ESS) the quality management framework for the coordination of consolidated economic policies. The Council pointed out that the European Statistical System management was improved during recent years and that progress was noticed in relation to high-quality statistics production and dissemination within the European Union, but it also noticed that, in the context of the recent financial crisis, certain weaknesses were identified, particularly related to the general quality management framework. The "Greece case" proved that these advances were not enough to guarantee the complete independence of national statistical institutes and entailed the need for further consolidating ESS governance. Several undertakings are now in the preparatory stage, in accordance with the Commission Communication; these actions are welcomed, but the question arises: are they sufficient for definitively solving the problem? The paper aims to go further in the attempt to identify a different, innovative (and courageous!) way, in the long run, towards an advanced institutional structure of the ESS, by setting up a European System of Statistical Institutes, similar to the European System of Central Banks, which would require a change in the Treaty.

  4. Quality control in diagnostic radiology - patient dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Prlic, I; Radalj, Z; Brumen, V; Cerovac, H [Institute for Medical Research and Occupational Health, Laboratory for Radiation Protection and Dosimetry, Zagreb (Croatia); Gladic, J [Institute for Physics, Laboratory for Solid State Physics, Zagreb (Croatia); Tercek, V [Clinical Hospital Sisters of Mercy, Health Physics Department, Zagreb (Croatia)

    1997-12-31

    In order to establish quality criteria for diagnostic radiographic images in the radiology departments of the Republic of Croatia, we have started several quality control projects in the field. The measurements are performed according to the methodology recommendations in our law, but the methodology, measurement principles, measurement equipment, phantoms, measurable parameters suitable for routine use by radiographers, statistical and numerical evaluation, dosimetric philosophy, etc. were at first left to the individual or group judgement of each person involved in the evaluation of diagnostic radiology images/diagnoses. The important quality elements of the imaging process are: the diagnostic quality of the radiographic image, the radiation dose to the patient and the choice of the radiographic technique. This depends on the radiation quality of the x-ray unit (tube), the image processing quality and the final image evaluation quality. In this paper we show how quality control measurements can easily be connected to the dose delivered to the patient for a known diagnostic procedure and how this can be used by radiographers in their daily work. The reproducibility of the x-ray generator was checked before and after service calibration. A table of kV dependence and output dose per mAs was calculated, and the ESD (entrance surface dose) was measured/calculated for the specific diagnostic procedure. After the phantom calculations were made and the dose prediction for the given procedure was done, measurements were performed on patients (digital dosemeters, TLD and film dosemeter combinations). We claim that there is no need to measure each patient if the proper quality control measurements are done and a proper table of ESD for each particular x-ray tube in the diagnostic department is calculated for the radiographers' daily use. (author). 1 example, 1 fig., 13 refs.

  5. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Science.gov (United States)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
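
    A minimal, rule-based version of the chart interpretation task can be written directly; the sketch below checks two classic run rules (a point beyond three sigma, and eight consecutive points on the same side of the center line) on simulated data. It illustrates the kind of patterns such a system looks for, not the AISC prototype itself.

```python
import numpy as np

rng = np.random.default_rng(2)
center, sigma = 10.0, 1.0
data = rng.normal(center, sigma, size=40)
data[25:] += 1.5 * sigma          # inject a sustained shift to trigger a run rule

def beyond_three_sigma(x, center, sigma):
    """Rule: any single point more than 3 sigma from the center line."""
    return [i for i, v in enumerate(x) if abs(v - center) > 3 * sigma]

def run_of_eight(x, center):
    """Rule (one common form): eight consecutive points on one side of center."""
    hits, run_side, run_len = [], 0, 0
    for i, v in enumerate(x):
        side = 1 if v > center else -1
        run_len = run_len + 1 if side == run_side else 1
        run_side = side
        if run_len >= 8:
            hits.append(i)
    return hits

print("points beyond 3 sigma:", beyond_three_sigma(data, center, sigma))
print("end points of runs of 8 on one side:", run_of_eight(data, center))
```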

  6. SAQC: SNP Array Quality Control

    Directory of Open Access Journals (Sweden)

    Li Ling-Hui

    2011-04-01

    Full Text Available Abstract Background Genome-wide single-nucleotide polymorphism (SNP) arrays containing hundreds of thousands of SNPs from the human genome have proven useful for studying important human genome questions. Data quality of SNP arrays plays a key role in the accuracy and precision of downstream data analyses. However, good indices for assessing data quality of SNP arrays have not yet been developed. Results We developed new quality indices to measure the quality of SNP arrays and/or DNA samples and investigated their statistical properties. The indices quantify a departure of estimated individual-level allele frequencies (AFs) from expected frequencies via standardized distances. The proposed quality indices followed lognormal distributions in several large genomic studies that we empirically evaluated. AF reference data and quality index reference data for different SNP array platforms were established based on samples from various reference populations. Furthermore, a confidence interval method based on the underlying empirical distributions of quality indices was developed to identify poor-quality SNP arrays and/or DNA samples. Analyses of authentic biological data and simulated data show that this new method is sensitive and specific for the detection of poor-quality SNP arrays and/or DNA samples. Conclusions This study introduces new quality indices, establishes references for AFs and quality indices, and develops a detection method for poor-quality SNP arrays and/or DNA samples. We have developed a new computer program that utilizes these methods called SNP Array Quality Control (SAQC). SAQC software is written in R and R-GUI and was developed as a user-friendly tool for the visualization and evaluation of data quality of genome-wide SNP arrays. The program is available online (http://www.stat.sinica.edu.tw/hsinchou/genetics/quality/SAQC.htm).
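
    The core idea of comparing individual-level allele frequencies with population reference frequencies via a standardized distance can be sketched roughly as follows; the index, the simulated data and the fixed threshold below are generic illustrations and are not the SAQC definitions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_snps = 10_000

# Reference allele frequencies (AFs) for the matched population (assumed known).
ref_af = rng.uniform(0.05, 0.95, size=n_snps)

# Simulated allele-frequency estimates for one good and one noisy array.
good_array = np.clip(ref_af + rng.normal(0, 0.02, size=n_snps), 0, 1)
noisy_array = np.clip(ref_af + rng.normal(0, 0.10, size=n_snps), 0, 1)

def af_quality_index(est_af, ref_af):
    """Generic standardized distance between estimated and reference AFs.

    Deviations are scaled by the binomial-style spread sqrt(p(1-p)) of the
    reference frequency; larger values indicate poorer data quality.
    """
    scale = np.sqrt(ref_af * (1 - ref_af))
    z = (est_af - ref_af) / scale
    return np.sqrt(np.mean(z ** 2))

indices = {"good array": af_quality_index(good_array, ref_af),
           "noisy array": af_quality_index(noisy_array, ref_af)}

# A cutoff could be derived from the empirical (e.g. lognormal) distribution of
# indices over reference arrays; a fixed illustrative threshold is used here.
threshold = 0.15
for name, qi in indices.items():
    verdict = "flag for review" if qi > threshold else "pass"
    print(f"{name}: quality index = {qi:.3f} -> {verdict}")
```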

  7. Quality Management, Quality Assurance and Quality Control in Blood Establishments

    OpenAIRE

    Bolbate, N

    2008-01-01

    Quality terms and the roots of the matter are analyzed according to the European Committee's recommendations. The essence of process and product quality control, as well as the essence of quality assurance, is described. The structure of the quality system, including quality control, quality assurance and management, is justified in the article.

  8. Cleaving of TOPAS and PMMA microstructured polymer optical fibers: Core-shift and statistical quality optimization

    DEFF Research Database (Denmark)

    Stefani, Alessio; Nielsen, Kristian; Rasmussen, Henrik K.

    2012-01-01

    We fabricated an electronically controlled polymer optical fiber cleaver, which uses a razor-blade guillotine and provides independent control of fiber temperature, blade temperature, and cleaving speed. To determine the optimum cleaving conditions of microstructured polymer optical fibers (mPOFs) with hexagonal hole structures we developed a program for cleaving quality optimization, which reads in a microscope image of the fiber end-facet and determines the core-shift and the statistics of the hole diameter, hole-to-hole pitch, hole ellipticity, and direction of major ellipse axis. For 125μm in diameter...

  9. Kansas's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  10. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods, and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  11. Bootstrap-based confidence estimation in PCA and multivariate statistical process control

    DEFF Research Database (Denmark)

    Babamoradi, Hamid

    Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based ... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus ... can be used to detect outliers in the data, since outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in the case of the Q-statistic ...
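
    The bootstrap alternative to asymptotic control limits can be illustrated in a few lines: resample the in-control values of a monitoring statistic and take an upper percentile of each resample as the limit. The sketch below applies this to a generic statistic with invented data; it is not the thesis's algorithm, which addresses PCA-based statistics and contribution plots specifically.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented in-control ("phase I") values of some monitoring statistic,
# e.g. a squared prediction error from a PCA model.
phase1 = rng.chisquare(df=4, size=200)

def bootstrap_upper_limit(values, alpha=0.01, n_boot=5000, rng=None):
    """Estimate the (1 - alpha) upper control limit via bootstrap percentiles."""
    rng = rng or np.random.default_rng()
    n = len(values)
    boot_quantiles = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(values, size=n, replace=True)
        boot_quantiles[b] = np.quantile(sample, 1 - alpha)
    # Average the resampled quantiles to obtain the limit estimate.
    return boot_quantiles.mean()

limit = bootstrap_upper_limit(phase1, alpha=0.01, rng=rng)
print(f"bootstrap 99% upper control limit: {limit:.2f}")

# New ("phase II") observations are flagged when they exceed the limit.
phase2 = rng.chisquare(df=4, size=20)
phase2[-1] *= 4                     # inject a fault
print("alarms at indices:", np.where(phase2 > limit)[0])
```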

  12. HIV quality report cards: impact of case-mix adjustment and statistical methods.

    Science.gov (United States)

    Ohl, Michael E; Richardson, Kelly K; Goto, Michihiko; Vaughan-Sarrazin, Mary; Schweizer, Marin L; Perencevich, Eli N

    2014-10-15

    There will be increasing pressure to publicly report and rank the performance of healthcare systems on human immunodeficiency virus (HIV) quality measures. To inform discussion of public reporting, we evaluated the influence of case-mix adjustment when ranking individual care systems on the viral control quality measure. We used data from the Veterans Health Administration (VHA) HIV Clinical Case Registry and administrative databases to estimate case-mix adjusted viral control for 91 local systems caring for 12 368 patients. We compared results using 2 adjustment methods, the observed-to-expected estimator and the risk-standardized ratio. Overall, 10 913 patients (88.2%) achieved viral control (viral load ≤400 copies/mL). Prior to case-mix adjustment, system-level viral control ranged from 51% to 100%. Seventeen (19%) systems were labeled as low outliers (performance significantly below the overall mean) and 11 (12%) as high outliers. Adjustment for case mix (patient demographics, comorbidity, CD4 nadir, time on therapy, and income from VHA administrative databases) reduced the number of low outliers by approximately one-third, but results differed by method. The adjustment model had moderate discrimination (c statistic = 0.66), suggesting potential for unadjusted risk when using administrative data to measure case mix. Case-mix adjustment affects rankings of care systems on the viral control quality measure. Given the sensitivity of rankings to selection of case-mix adjustment methods-and potential for unadjusted risk when using variables limited to current administrative databases-the HIV care community should explore optimal methods for case-mix adjustment before moving forward with public reporting. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
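
    The observed-to-expected style of case-mix adjustment mentioned above can be sketched as follows: fit a patient-level model for the outcome across all systems, then compare each system's observed count of patients with viral control to the sum of the model-predicted probabilities. The code uses synthetic covariates and a simple logistic model as placeholders, not the VHA variables or the study's exact estimators.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n_patients, n_systems = 5000, 40

# Synthetic case-mix covariates and care-system assignment.
age = rng.normal(50, 10, n_patients)
comorbidity = rng.poisson(1.5, n_patients)
system = rng.integers(0, n_systems, n_patients)

# Synthetic outcome: viral control depends on case mix plus a system effect.
system_effect = rng.normal(0, 0.4, n_systems)
logit = 1.5 - 0.02 * (age - 50) - 0.3 * comorbidity + system_effect[system]
controlled = rng.random(n_patients) < 1 / (1 + np.exp(-logit))

# Expected probabilities from a case-mix-only model (no system indicators).
X = np.column_stack([age, comorbidity])
model = LogisticRegression().fit(X, controlled)
expected = model.predict_proba(X)[:, 1]

# Observed-to-expected ratio per care system (printed for a few systems).
for s in range(5):
    mask = system == s
    oe = controlled[mask].sum() / expected[mask].sum()
    print(f"system {s:2d}: O/E = {oe:.2f} (n = {mask.sum()})")
```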

  13. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  14. An integrated model of statistical process control and maintenance based on the delayed monitoring

    International Nuclear Information System (INIS)

    Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei

    2015-01-01

    This paper develops an integrated model of statistical process control and maintenance decisions. The proposed delayed monitoring policy postpones the sampling process until a scheduled time and gives rise to ten scenarios of the production process, in which equipment failure may occur besides a quality shift. An equipment failure and a control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, cycle time and cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to verify the analysis of the ten-scenario model. Another ordinary integrated model without delayed monitoring is also established for comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy will help reduce the expected cost

  15. Principles and Practices for Quality Assurance and Quality Control

    Science.gov (United States)

    Jones, Berwyn E.

    1999-01-01

    Quality assurance and quality control are vital parts of highway runoff water-quality monitoring projects. To be effective, project quality assurance must address all aspects of the project, including project management responsibilities and resources, data quality objectives, sampling and analysis plans, data-collection protocols, data quality-control plans, data-assessment procedures and requirements, and project outputs. Quality control ensures that the data quality objectives are achieved as planned. The historical development and current state of the art of quality assurance and quality control concepts described in this report can be applied to evaluation of data from prior projects.

  16. Application of Metabolomics to Quality Control of Natural Product Derived Medicines.

    Science.gov (United States)

    Lee, Kyung-Min; Jeon, Jun-Yeong; Lee, Byeong-Ju; Lee, Hwanhui; Choi, Hyung-Kyoon

    2017-11-01

    Metabolomics has been used as a powerful tool for the analysis and quality assessment of the natural product (NP)-derived medicines. It is increasingly being used in the quality control and standardization of NP-derived medicines because they are composed of hundreds of natural compounds. The most common techniques that are used in metabolomics consist of NMR, GC-MS, and LC-MS in combination with multivariate statistical analyses including principal components analysis (PCA) and partial least squares-discriminant analysis (PLS-DA). Currently, the quality control of the NP-derived medicines is usually conducted using HPLC and is specified by one or two indicators. To create a superior quality control framework and avoid adulterated drugs, it is necessary to be able to determine and establish standards based on multiple ingredients using metabolic profiling and fingerprinting. Therefore, the application of various analytical tools in the quality control of NP-derived medicines forms the major part of this review. Veregen ® (Medigene AG, Planegg/Martinsried, Germany), which is the first botanical prescription drug approved by US Food and Drug Administration, is reviewed as an example that will hopefully provide future directions and perspectives on metabolomics technologies available for the quality control of NP-derived medicines.

  17. Preparation of quality control samples in radioimmunoassay for thyroid stimulating hormone (TSH)

    International Nuclear Information System (INIS)

    Ali, O.M.

    2006-03-01

    Nowadays, radioimmunoassay has become the best technique for analysing different concentrations of substances, especially in medical and research laboratories. Despite the specificity of RIA techniques, quality control must take place to give results that are as good as possible. In this dissertation I prepared quality control samples of thyroid stimulating hormone (TSH) for use in RIA techniques and to check the reliability of the results of those laboratories which use these methods. We used Chinese-produced RIA kits to determine the hormone at low, normal and high concentration levels. Statistical parameters were used to draw the control chart of the mean for these data. (Author)

  18. Checking quality control?

    DEFF Research Database (Denmark)

    Brodersen, Lars

    2005-01-01

    How is quality control doing within the community of GIS, web-services based on geo-information, GI etc.?

  19. SALE, Quality Control of Analytical Chemical Measurements

    International Nuclear Information System (INIS)

    Bush, W.J.; Gentillon, C.D.

    1985-01-01

    1 - Description of problem or function: The Safeguards Analytical Laboratory Evaluation (SALE) program is a statistical analysis program written to analyze the data received from laboratories participating in the SALE quality control and evaluation program. The system is aimed at identifying and reducing analytical chemical measurement errors. Samples of well-characterized materials are distributed to laboratory participants at periodic intervals for determination of uranium or plutonium concentration and isotopic distributions. The results of these determinations are statistically evaluated and participants are informed of the accuracy and precision of their results. 2 - Method of solution: Various statistical techniques produce the SALE output. Assuming an unbalanced nested design, an analysis of variance is performed, resulting in a test of significance for time and analyst effects. A trend test is performed. Both within- laboratory and between-laboratory standard deviations are calculated. 3 - Restrictions on the complexity of the problem: Up to 1500 pieces of data for each nuclear material sampled by a maximum of 75 laboratories may be analyzed
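
    The within- and between-laboratory standard deviations mentioned above correspond to a one-way random-effects decomposition; the sketch below estimates both from synthetic, balanced interlaboratory results using the classic ANOVA mean squares. It shows the general calculation only, not the SALE program's unbalanced nested analysis or trend tests.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic concentration results: 12 laboratories, 5 replicates each.
n_labs, n_rep = 12, 5
true_value = 10.0
lab_bias = rng.normal(0, 0.08, n_labs)              # between-laboratory effects
data = true_value + lab_bias[:, None] + rng.normal(0, 0.05, (n_labs, n_rep))

lab_means = data.mean(axis=1)
grand_mean = data.mean()

# One-way ANOVA mean squares.
ms_between = n_rep * np.sum((lab_means - grand_mean) ** 2) / (n_labs - 1)
ms_within = np.sum((data - lab_means[:, None]) ** 2) / (n_labs * (n_rep - 1))

s_within = np.sqrt(ms_within)
s_between = np.sqrt(max((ms_between - ms_within) / n_rep, 0.0))

print(f"within-laboratory SD : {s_within:.3f}")
print(f"between-laboratory SD: {s_between:.3f}")
print(f"grand mean           : {grand_mean:.3f} (reference value {true_value})")
```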

  20. EFFECT OF QUALITY CONTROL SYSTEM ON AUDIT QUALITY WITH PROFESSIONAL COMMITMENTS AS A MODERATION VARIABLE

    Directory of Open Access Journals (Sweden)

    Ramadhani R.

    2017-12-01

    Full Text Available This study aims to test the effect of each element of the Quality Control System (QCS), namely leadership responsibilities for quality on audits, relevant ethical requirements, acceptance and continuance of client relationships and specific engagements, assignment of engagement teams, engagement performance, monitoring, and documentation, on audit quality, as well as to test whether professional commitment moderates the effect of each element of the QCS on audit quality. The population was staff auditors working in public accounting firms domiciled in Jakarta City, especially the Central Jakarta area, with a sample of 84 respondents. The statistical method used was SEM PLS with the help of the SmartPLS application. The results of this study indicate that, of the seven elements of the QCS, only the relevant ethical requirements affect audit quality. Furthermore, the study also found that professional commitment cannot moderate the relationship between the seven elements of the QCS and audit quality.

  1. Statistical analysis of longitudinal quality of life data with missing measurements

    NARCIS (Netherlands)

    Zwinderman, A. H.

    1992-01-01

    The statistical analysis of longitudinal quality of life data in the presence of missing data is discussed. In cancer trials missing data are generated due to the fact that patients die, drop out, or are censored. These missing data are problematic in the monitoring of the quality of life during the

  2. Quality control of intelligence research

    International Nuclear Information System (INIS)

    Lu Yan; Xin Pingping; Wu Jian

    2014-01-01

    Quality control of intelligence research is the core issue of intelligence management and a key problem in the study of information science. This paper focuses on the performance of intelligence to explain the significance of quality control in intelligence research. Summing up and analysing the results of previous studies, it discusses quality control methods in intelligence research, introduces the experience of foreign intelligence research quality control, and proposes some recommendations to improve quality control in intelligence research. (authors)

  3. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  4. Interaction between production control and quality control

    NARCIS (Netherlands)

    Bij, van der J.D.; Ekert, van J.H.W.

    1999-01-01

    Describes a qualitative study on interaction between systems for production control and quality control within industrial organisations. Production control and quality control interact in a sense. Good performance for one aspect often influences or frustrates the performance of the other. As far as

  5. Quality control in radiotherapy

    International Nuclear Information System (INIS)

    Batalla, A.

    2009-01-01

    The authors discuss the modalities and periodicities of internal quality control on radiotherapy installations. They indicate the different concerned systems and the aspects and items to be controlled (patient and personnel security, apparatus mechanical characteristics, beam quality, image quality, isodose and irradiation duration calculation, data transfer). They present the measurement instruments and tools used for the mechanical controls, dose measurement, beam homogeneity and symmetry, anatomic data acquisition systems, and dose distribution and control imagery calculation

  6. Statistical methods for quality assurance

    International Nuclear Information System (INIS)

    Rinne, H.; Mittag, H.J.

    1989-01-01

    This is the first German-language textbook on quality assurance and the fundamental statistical methods that is suitable for private study. The material for this book has been developed from a course of Hagen Open University and is characterized by a particularly careful didactical design which is achieved and supported by numerous illustrations and photographs, more than 100 exercises with complete problem solutions, many fully displayed calculation examples, surveys fostering a comprehensive approach, bibliography with comments. The textbook has an eye to practice and applications, and great care has been taken by the authors to avoid abstraction wherever appropriate, to explain the proper conditions of application of the testing methods described, and to give guidance for suitable interpretation of results. The testing methods explained also include latest developments and research results in order to foster their adoption in practice. (orig.) [de

  7. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Science.gov (United States)

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcers prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system) were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
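
    A P chart of the kind used in this comparison is straightforward to compute: the limits follow from the binomial standard error around the pooled proportion, with the denominator varying by survey. The sketch below uses invented yearly counts; it illustrates the method only and is not the Dutch survey data.

```python
import numpy as np

# Invented yearly counts: patients at risk (n) and nosocomial pressure ulcers (x).
n = np.array([950, 1010, 980, 1100, 1050, 990, 1020, 1080, 1000, 970, 1040])
x = np.array([120, 131, 118, 140, 119, 105, 112, 121,  98,  92, 101])

p = x / n                         # yearly prevalence proportions
p_bar = x.sum() / n.sum()         # pooled (center-line) proportion

# P chart limits vary with each year's denominator.
se = np.sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * se
lcl = np.clip(p_bar - 3 * se, 0, None)

for year, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1998):
    signal = "special cause" if (pi > hi or pi < lo) else "common cause only"
    print(f"{year}: p = {pi:.3f}  limits = [{lo:.3f}, {hi:.3f}]  -> {signal}")
```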

  8. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  9. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms

  10. The Profile of Creativity and Proposing Statistical Problem Quality Level Reviewed From Cognitive Style

    Science.gov (United States)

    Awi; Ahmar, A. S.; Rahman, A.; Minggi, I.; Mulbar, U.; Asdar; Ruslan; Upu, H.; Alimuddin; Hamda; Rosidah; Sutamrin; Tiro, M. A.; Rusli

    2018-01-01

    This research aims to reveal the profile of the level of creativity and the ability to pose statistical problems of students of the Mathematics Education 2014 Batch at the State University of Makassar in terms of their cognitive style. This research uses an explorative qualitative method, with meta-cognitive scaffolding given at the time of the research. The research hypothesis is that students with a field-independent (FI) cognitive style, when posing statistical problems from the provided information, are already able to pose solvable problems that create new data, and such problems qualify as high-quality statistical problems, while students with a field-dependent (FD) cognitive style are commonly still limited to posing solvable problems that do not introduce new data, and such problems qualify as medium-quality statistical problems.

  11. Quality assurance and quality control

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    The practice of nuclear diagnostic imaging requires an appropriate quality assurance program to attain high standards of efficiency and reliability. The International Atomic Energy Agency defines the term quality assurance as ''the closeness with which the outcome of a given procedure approaches some ideal, free from all errors and artifacts.'' The term quality control is used in reference to the specific measures taken to ensure that one particular aspect of the procedure is satisfactory. Therefore, quality assurance is a hospital-wide concept that should involve all aspects of clinical practice. Quality control is concerned with the submission of requests for procedures; the scheduling of patients; the preparation and dispensing of radiopharmaceuticals; the protection of patients, staff, and the general public against radiation hazards and accidents caused by radioactive materials or by faulty equipment; the setting up, use, and maintenance of electronic instruments; the methodology of the actual procedures; the analysis and interpretation of data; the reporting of results; and, finally, the keeping of records. The chapter discusses each of these areas

  12. Industrial statistics and its recent contributions to total quality in the Netherlands

    NARCIS (Netherlands)

    Does, R.J.M.M.; Roes, K.C.B.

    1996-01-01

    The use of statistical methods in quality management has a long history. Most of the pioneers, such as Walter A. Shewhart and W. Edwards Deming, refer to themselves as statisticians. Statistical thinking in industry means that all work is a series of interconnected processes, that all processes show

  13. Application of environmetric methods to investigate control factors on water quality

    Directory of Open Access Journals (Sweden)

    Boyacioglu Hülya

    2017-09-01

    Full Text Available In this study, environmetric methods were successfully applied (a) to explore natural and anthropogenic controls on reservoir water quality, (b) to investigate spatial and temporal differences in quality, and (c) to determine the quality variables discriminating three reservoirs in Izmir, Turkey. Results showed that overall water quality was mainly governed by "natural factors" in the whole region. A parameter that was the most important contributor to water quality variation for one reservoir was not important for another. Between the summer and winter periods, differences in arsenic concentrations were statistically significant in the Tahtalı and Ürkmez reservoirs, and differences in iron concentrations were significant in the Balçova reservoir. The occurrence of high/low levels in the two seasons was explained by different processes, for instance dilution from runoff at times of high flow, rainwater that seeped through the soil and entered the river along with surface run-off, and adsorption. Three variables (boron, arsenic and sulphate) discriminated quality between the Balçova & Tahtalı and Balçova & Ürkmez reservoirs, and two variables (zinc and arsenic) between the Tahtalı & Ürkmez reservoirs. The results illustrated the usefulness of multivariate statistical techniques to fingerprint pollution sources and investigate temporal/spatial variations in water quality.

  14. Application of statistical classification methods for predicting the acceptability of well-water quality

    Science.gov (United States)

    Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.

    2018-01-01

    The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
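
    The classification task described above can be prototyped quickly: label each well by whether chloride exceeds the allowable concentration and train a classifier on ancillary predictors. The sketch below uses synthetic wells, invented predictor names and a random forest purely as a placeholder for whichever algorithm performed best in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
n_wells = 400

# Synthetic predictors: well depth (m), distance to the saline zone (km),
# and ground elevation (m a.s.l.).
depth = rng.uniform(20, 200, n_wells)
dist_saline = rng.uniform(0.1, 15.0, n_wells)
elevation = rng.normal(60, 15, n_wells)

# Synthetic chloride concentration (mg/L), higher near the saline zone.
chloride = 400 * np.exp(-dist_saline / 3) + 0.3 * depth + rng.normal(0, 30, n_wells)
limit = 250.0                              # allowable chloride concentration
unfit = (chloride > limit).astype(int)     # 1 = water unfit for the intended use

X = np.column_stack([depth, dist_saline, elevation])
X_train, X_test, y_train, y_test = train_test_split(
    X, unfit, test_size=0.25, random_state=0, stratify=unfit)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test), digits=2))
```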

  15. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB, which includes arrangement and summary of data, probability, probability distribution, sampling distribution, assumption, check, variance analysis, regression analysis, categorical data analysis, quality assurance such as conception of control chart, consecutive control chart, breakthrough strategy and analysis using Matlab, reliability analysis like measurement of reliability and analysis with Maltab, and Markov chain.

  16. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Science.gov (United States)

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  17. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  18. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  19. [On-site quality control of acupuncture randomized controlled trial: design of content and checklist of quality control based on PICOST].

    Science.gov (United States)

    Li, Hong-Jiao; He, Li-Yun; Liu, Zhi-Shun; Sun, Ya-Nan; Yan, Shi-Yan; Liu, Jia; Zhao, Ye; Liu, Bao-Yan

    2014-02-01

    To effectively guarantee the quality of randomized controlled trials (RCTs) of acupuncture and to develop reasonable content and checklists for on-site quality control, the factors influencing the quality of acupuncture RCTs are analyzed, and the scientific soundness of the quality control content and the feasibility of on-site implementation are taken into overall consideration. Based on the content and checklists of on-site quality control in the National 11th Five-Year Plan projects Optimization of Comprehensive Treatment Plan for TCM in Prevention and Treatment of Serious Disease and Clinical Assessment on Generic Technology and Quality Control Research, it is proposed that on-site quality control of acupuncture RCTs should be conducted with PICOST (patient, intervention, comparison, outcome, site and time) as the core, with particular attention to quality control of the interveners' skills and the blinding of outcome assessment, and a checklist for on-site quality control is developed to provide references for the groups undertaking the project.

  20. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  1. Robust Control Methods for On-Line Statistical Learning

    Directory of Open Access Journals (Sweden)

    Capobianco Enrico

    2001-01-01

    Full Text Available The issue of ensuring that the results of data processing in an experiment are not affected by the presence of outliers is relevant for statistical control and learning studies. Learning schemes should thus be tested for their capacity to handle outliers in the observed training set, so as to achieve reliable estimates with respect to the crucial bias and variance aspects. We describe possible ways of endowing neural networks with statistically robust properties by defining feasible error criteria. It is convenient to cast neural nets in state space representations and apply both Kalman filter and stochastic approximation procedures in order to suggest statistically robustified solutions for on-line learning.
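
    One "feasible error criterion" of the robust kind alluded to above is the Huber loss, which behaves quadratically for small residuals and linearly for large ones so that outliers cannot dominate the updates. The sketch below uses it inside a plain stochastic-gradient loop on a linear model with invented data; it is a generic illustration, not the paper's neural-network or Kalman-filter formulation.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic stream: y = 2x + 1 with occasional gross outliers mixed in.
X = rng.uniform(-3, 3, size=500)
y = 2.0 * X + 1.0 + rng.normal(0, 0.3, 500)
outliers = rng.random(500) < 0.05
y[outliers] += rng.choice([-20.0, 20.0], size=outliers.sum())

def huber_grad(residual, delta=1.0):
    """Derivative of the Huber loss: quadratic core, linear (clipped) tails."""
    return np.clip(residual, -delta, delta)

w, b, lr = 0.0, 0.0, 0.02
for _ in range(5):                       # a few passes over the stream
    for xi, yi in zip(X, y):
        r = (w * xi + b) - yi            # residual of the current prediction
        g = huber_grad(r)
        w -= lr * g * xi                 # on-line (stochastic) updates
        b -= lr * g

print(f"robust on-line fit: w = {w:.2f}, b = {b:.2f} (true values 2.0 and 1.0)")
```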

  2. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of extra-analytical and analytical process, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compare its data with other laboratories, through external quality control. In this way it has a tool to detect the fulfillment of the objectives set, and in case of errors, allowing corrective actions to be made, and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodical assessment intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operation procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, systematic, random and total error at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions
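
    The error metrics at the centre of such a protocol reduce to a few formulas: the coefficient of variation for random error, bias against the control target for systematic error, and a combined total error compared with the quality specification. The sketch below computes them for one invented analyte and control level; the specification value and the 1.65 multiplier are common conventions, not necessarily those adopted by the authors.

```python
import numpy as np

# Invented monthly internal QC results for one analyte at one control level.
qc_results = np.array([102.1, 99.5, 101.3, 100.8, 98.9, 101.9, 100.2,
                       99.1, 102.5, 100.6, 99.8, 101.1])
target_value = 100.0          # assigned value of the control material
spec_total_error = 6.0        # allowable total error (%) from the chosen specification

mean = qc_results.mean()
sd = qc_results.std(ddof=1)

cv = 100 * sd / mean                                  # random error (imprecision), %
bias = 100 * (mean - target_value) / target_value     # systematic error, %
total_error = abs(bias) + 1.65 * cv                   # common total-error estimate, %

print(f"CV          = {cv:.2f} %")
print(f"bias        = {bias:+.2f} %")
verdict = "meets" if total_error <= spec_total_error else "fails"
print(f"total error = {total_error:.2f} % ({verdict} the {spec_total_error}% specification)")
```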

  3. Implementation of dosimetric quality control on IMRT and VMAT treatments in radiotherapy using diodes

    International Nuclear Information System (INIS)

    Gonzales, A.; Garcia, B.; Ramirez, J.; Marquina, J.

    2014-08-01

    The aim was to implement quality control of IMRT and VMAT RapidArc radiotherapy treatments using diode arrays. Ninety patients treated with IMRT and VMAT RapidArc were tested, comparing the planned dose with the delivered dose using the Sun Nuclear MapCHECK 2 and ArcCHECK devices and evaluating the gamma index with 3%/3 mm comparison criteria. The statistics show that the quality controls of the 90 patients analyzed presented a percentage of passing diodes between 96.7% and 100.0% of the irradiated diodes. The method for quality control of IMRT and VMAT RapidArc radiotherapy treatments using diode arrays was thus implemented at the Clinica ALIADA Oncologia Integral. (Author)

  4. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals. This book guides you in setting up and running continuous quality control in your environment. ...

  5. Temporal aspects of surface water quality variation using robust statistical tools.

    Science.gov (United States)

    Mustapha, Adamu; Aris, Ahmad Zaharin; Ramli, Mohammad Firuz; Juahir, Hafizan

    2012-01-01

    Robust statistical tools were applied to the water quality datasets with the aim of determining the most significant parameters and their contribution towards temporal water quality variation. Surface water samples were collected from four different sampling points during the dry and wet seasons and analyzed for their physicochemical constituents. Discriminant analysis (DA) provided the best results, with great discriminatory ability: for the dry season it used five parameters (P < 0.05), affording more than 96% correct assignation, while for the wet season data it used five and six parameters in the forward and backward stepwise modes (P < 0.05), affording 68.20% and 82% correct assignation, respectively. Partial correlation results revealed strong (r(p) = 0.829) and moderate (r(p) = 0.614) relationships between five-day biochemical oxygen demand (BOD(5)) and chemical oxygen demand (COD), and between total solids (TS) and dissolved solids (DS), controlling for the linear effect of nitrogen in the form of ammonia (NH(3)) and conductivity for the dry and wet seasons, respectively. Multiple linear regression identified the contribution of each variable, with significant values r = 0.988, R(2) = 0.976 and r = 0.970, R(2) = 0.942 (P < 0.05) for the dry and wet seasons, respectively. A repeated-measures t-test confirmed that the surface water quality varies significantly between the seasons (P < 0.05).

  6. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  7. Exploring the use of statistical process control methods to assess course changes

    Science.gov (United States)

    Vollstedt, Ann-Marie

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time consuming and are subject to error because they are based on grades, which are influenced by numerous variables, independent of student ability and effort (e.g. inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, in this study, quality was defined as content knowledge, cognitive level, self efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshmen classes in an effort to increase the qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, it was felt that individual course grades needed adjustment to accommodate for the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control
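
    Chauvenet's criterion, used above to screen the quality-test data, rejects an observation when the expected number of points as extreme as it is, in a sample of size N, falls below one half. A minimal sketch under the usual normality assumption, with invented quiz scores:

```python
import numpy as np
from scipy.stats import norm

def chauvenet_mask(values):
    """Return a boolean mask that is False for points rejected by Chauvenet's criterion."""
    values = np.asarray(values, dtype=float)
    n = len(values)
    mean, sd = values.mean(), values.std(ddof=1)
    z = np.abs(values - mean) / sd
    # Two-sided tail probability of a deviation at least this large.
    prob = 2 * norm.sf(z)
    # Reject when the expected count of such extreme points (n * prob) is below 0.5.
    return n * prob >= 0.5

quiz_scores = np.array([78, 82, 85, 80, 79, 84, 81, 83, 40, 86])
keep = chauvenet_mask(quiz_scores)
print("kept    :", quiz_scores[keep])
print("rejected:", quiz_scores[~keep])
```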

  8. Quality assurance and quality control

    International Nuclear Information System (INIS)

    Kaden, W.

    1986-01-01

    General preconditions and methods for QA work in the nuclear field are analysed. The application of general QA principles to actual situations is illustrated by examples in the fields of engineering and of the manufacturing of mechanical and electrical components. All QA measures must be fitted to the complexity and relevance of the work steps, which are under consideration. The key to good product quality is the control of working processes. The term 'controlled process' is discussed in detail and examples of feed back systems are given. The main QA measures for the operation of nuclear power plants include the establishment of a Quality Assurance Program, training and qualification of personnel, procurement control, inspection and tests, reviews and audits. These activities are discussed. (orig.)

  9. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    Science.gov (United States)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool for ensuring that a company can succeed in its business. In order to survive in the global market, with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential for improving business performance. Findings on the relationship between TQM and business performance have been consistent. However, only a few previous studies have examined the mediator effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediator effect of SPC, using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The analysis of the mediator effect between TQM practices and business performance showed that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.

  10. Quality control in nuclear medicine

    International Nuclear Information System (INIS)

    Leme, P.R.

    1983-01-01

    The following topics are discussed: objectives of the quality control in nuclear medicine; the necessity of the quality control in nuclear medicine; guidelines and recommendations. An appendix is given concerning the guidelines for the quality control and instrumentation in nuclear medicine. (M.A.) [pt

  11. 40 CFR 75.21 - Quality assurance and quality control requirements.

    Science.gov (United States)

    2010-07-01

    ... quality assurance audit or any other audit, the system is out-of-control. The owner or operator shall... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Quality assurance and quality control... assurance and quality control requirements. (a) Continuous emission monitoring systems. The owner or...

  12. Quality control of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Kristensen, K.

    1981-01-01

    Quality assurance was introduced in the pharmaceutical field long before it was used in many other areas, and the term quality control has been used in a much broader sense than merely analytical quality control. The term Good Manufacturing Practice (GMP) has been used to describe the system used for producing safe and effective drugs of a uniform quality. GMP has also been used for the industrial production of radiopharmaceuticals. For the preparation and control of radiopharmaceuticals in hospitals a similar system has been named Good Radiopharmacy Practice (GRP). It contains the same elements as GMP but takes into account the special nature of this group of drugs. Data on the assessment of the quality of radiopharmaceuticals in relation to present standards are reviewed. The general conclusion is that the quality of radiopharmaceuticals appears comparable to that of other drugs. It seems possible to establish the production of radiopharmaceuticals, generators and preparation kits in such a way that analytical control of the final product at the hospital may be limited provided the final preparation work is carried out in accordance with GRP principles. The elements of GRP are reviewed. (author)

  13. Preparation of an estuarine sediment quality control material for the determination of trace metals

    Directory of Open Access Journals (Sweden)

    Hatje Vanessa

    2006-01-01

    Quality Control Materials (QCM) have been used routinely in daily laboratory work as a tool to fill the gap between the need for and the availability of Certified Reference Materials (CRM). QCM are a low-cost alternative to CRMs and are in high demand, especially for the implementation of quality control systems in laboratories in several areas. This paper describes the preparation of a QCM for the determination of trace metals in estuarine sediments and the results of an interlaboratory exercise. Homogeneity and stability studies were performed and analysis of variance was carried out on the results. No statistically significant differences were observed in the concentrations of Co, Cr, Cu, Mn, Pb and Zn between- or within-bottle results. Neither storage nor temperature affected the results. Therefore, the QCM produced is considered homogeneous and stable and can be used for statistical control charts, evaluation of reproducibility and interlaboratory exercises.

  14. Commercial jet fuel quality control

    Energy Technology Data Exchange (ETDEWEB)

    Strauss, K.H.

    1995-05-01

    The paper discusses the purpose of jet fuel quality control between the refinery and the aircraft. It describes fixed equipment, including various types of filters, and the usefulness and limitations of this equipment. Test equipment is reviewed as are various surveillance procedures. These include the Air Transport Association specification ATA 103, the FAA Advisory Circular 150/5230-4, the International Air Transport Association Guidance Material for Fuel Quality Control and Fuelling Service and the Guidelines for Quality Control at Jointly Operated Fuel Systems. Some past and current quality control problems are briefly mentioned.

  15. SPECT quality control tests

    International Nuclear Information System (INIS)

    Robilotta, C.C.; Rebelo, M.F.S.; Oliveira, M.A.; Abe, R.

    1987-01-01

    Quality control tests of a tomographic system composed of a rotating gamma camera (CGR Gammatomome T-9000) and a microcomputer are presented. Traditional quality control tests for scintillation cameras and specific tests for tomographic systems are reported. (M.A.C.) [pt

  16. Statistical Control Charts: Performances of Short Term Stock Trading in Croatia

    Directory of Open Access Journals (Sweden)

    Dumičić Ksenija

    2015-03-01

    Background: The stock exchange, as a regulated financial market, reflects the economic development level of modern economies. The stock market indicates the mood of investors in the development of a country and is an important ingredient for growth. Objectives: This paper aims to introduce an additional statistical tool to support the decision-making process in stock trading, and it investigates the use of statistical process control (SPC) methods in the stock trading process. Methods/Approach: The individual (I), exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) control charts were used for generating trade signals. The open and average prices of CROBEX10 index stocks on the Zagreb Stock Exchange were used in the analysis. The capabilities of statistical control charts for short-run stock trading were analysed. Results: The statistical control chart analysis pointed out too many signals to buy or sell stocks, most of which are considered false alarms, so the statistical control charts proved not to be very useful in stock trading or in portfolio analysis. Conclusions: The presence of non-normality and autocorrelation has a great impact on the performance of statistical control charts. It is assumed that if these two problems are solved, the use of statistical control charts in portfolio analysis could be greatly improved.
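
    As a rough illustration of how the I, EWMA and CUSUM charts mentioned above can generate trading signals, the following Python sketch computes an EWMA chart and a tabular CUSUM on a synthetic price series. The smoothing constant, the reference value and decision interval, the baseline window and the price data are all assumptions chosen for demonstration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 0.5, 120))  # synthetic daily prices

# Baseline estimated from an assumed "in-control" window (first 30 days)
mu0, sigma0 = prices[:30].mean(), prices[:30].std(ddof=1)

# EWMA chart (lambda = 0.2, L = 3 are conventional textbook choices)
lam, L = 0.2, 3.0
z = np.empty_like(prices)
z[0] = mu0
for t in range(1, prices.size):
    z[t] = lam * prices[t] + (1 - lam) * z[t - 1]
i = np.arange(1, prices.size + 1)
half_width = L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
ewma_signals = np.where(np.abs(z - mu0) > half_width)[0]

# Tabular CUSUM (reference value k = 0.5*sigma, decision interval h = 5*sigma)
k, h = 0.5 * sigma0, 5.0 * sigma0
cpos = cneg = 0.0
cusum_signals = []
for t, x in enumerate(prices):
    cpos = max(0.0, cpos + (x - mu0) - k)
    cneg = max(0.0, cneg + (mu0 - x) - k)
    if cpos > h or cneg > h:
        cusum_signals.append(t)
        cpos = cneg = 0.0  # reset after a signal

print("EWMA signals at days:", ewma_signals[:10])
print("CUSUM signals at days:", cusum_signals[:10])
```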

  17. Can a combination of average of normals and "real time" External Quality Assurance replace Internal Quality Control?

    Science.gov (United States)

    Badrick, Tony; Graham, Peter

    2018-03-28

    Internal Quality Control and External Quality Assurance are separate but related processes that have developed independently in laboratory medicine over many years. They have different sample frequencies, statistical interpretations and immediacy. Both processes have evolved, absorbing new understandings of the concept of laboratory error, sample material matrix and assay capability. However, we do not believe, at the coalface, that either process has led to much improvement in patient outcomes recently. It is the increasing reliability and automation of analytical platforms, along with the improved stability of reagents, that has reduced systematic and random error, which in turn has minimised the risk of running IQC less frequently. We suggest that it is time to rethink the role of both these processes and unite them into a single approach using an Average of Normals model supported by more frequent External Quality Assurance samples. This new paradigm may lead to less confusion for laboratory staff and quicker responses to, and identification of, out-of-control situations.
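
    A minimal sketch of the Average of Normals idea discussed above: the mean of consecutive patient results falling inside the reference interval is tracked against limits derived from an in-control period. Everything in the sketch (the reference interval, block size, limits and data) is an assumption for illustration; it is not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical analyte: reference interval 3.5-5.1 mmol/L, blocks of 20 patients
REF_LOW, REF_HIGH, BLOCK = 3.5, 5.1, 20

def average_of_normals(results):
    """Mean of the results that fall inside the reference interval."""
    inside = results[(results >= REF_LOW) & (results <= REF_HIGH)]
    return inside.mean() if inside.size else np.nan

# In-control period used to set the control limits for the block means
baseline_blocks = [average_of_normals(rng.normal(4.3, 0.4, BLOCK)) for _ in range(50)]
center = np.mean(baseline_blocks)
limit = 3 * np.std(baseline_blocks, ddof=1)

# A new block of patient results, shifted upwards to mimic a calibration drift
new_block = rng.normal(4.7, 0.4, BLOCK)
aon = average_of_normals(new_block)
print(f"AoN = {aon:.2f}, limits = {center - limit:.2f} .. {center + limit:.2f}",
      "-> investigate" if abs(aon - center) > limit else "-> in control")
```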

  18. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    Science.gov (United States)

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation associated with the principal maintenance intervention on the treatment system. In particular, the process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
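
    For readers unfamiliar with the individual value (X) charts used in the study, the sketch below shows the textbook construction of an individuals chart with limits derived from the average moving range. The daily output values are simulated, and the 2.66 constant is the standard d2-based factor for n = 2, not a value taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated daily output measurements, normalised to the baseline (1.0 = nominal)
output = rng.normal(1.0, 0.006, 60)
output[45:] += 0.02  # pretend a step change after a maintenance intervention

center = output.mean()
mr = np.abs(np.diff(output))            # moving ranges of consecutive days
mr_bar = mr.mean()
ucl = center + 2.66 * mr_bar            # 2.66 = 3 / d2 with d2 = 1.128 (n = 2)
lcl = center - 2.66 * mr_bar

out_of_control = np.where((output > ucl) | (output < lcl))[0]
print(f"CL={center:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}")
print("out-of-control days:", out_of_control)
```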

  19. SU-E-T-77: A Statistical Approach to Manage Quality for Pre-Treatment Verification in IMRT/VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Jassal, K [Fortis Memorial Research Institute, Gurgaon, Haryana (India); Sarkar, B [AMRI Cancer Centre and GLA university, Mathura, Kolkata, West Bengal (India); Mohanti, B; Roy, S; Ganesh, T [FMRI, Gurgaon, Haryana (India); Munshi, A [Fortis Memorial Research Institute, Gurgon, Haryana (India); Chougule, A [SMS Medical College and Hospital, Jaipur, Rajasthan (India); Sachdev, K [Malaviya National Institute of Technology, Jaipur, Rajasthan (India)

    2015-06-15

    Objective: This study presents the application of the simple concept of statistical process control (SPC) to the analysis of pre-treatment quality assurance procedures for planar dose measurements performed using a 2D array and an a-Si electronic portal imaging device (a-Si EPID). Method: A total of 195 patients from four different anatomical sites were selected for the study: brain (n1=45), head & neck (n2=45), thorax (n3=50) and pelvis (n4=55). Pre-treatment quality assurance for the clinically acceptable IMRT/VMAT plans was measured with the 2D array and the a-Si EPID of the accelerator. After the γ-analysis, control charts and the quality index Cpm were evaluated for each cohort. Results: The mean and σ of the γ (3%/3 mm) pass rate were 99.9% ± 1.15% for the a-Si EPID and 99.6% ± 1.06% for the 2D array. Among all plans, γmax was consistently lower for the 2D array than for the a-Si EPID. Fig. 1 presents the X-bar control charts for every cohort. Cpm values for the a-Si EPID were found to be higher than for the array; detailed results are presented in Table 1. Conclusion: The present study demonstrates the significance of control charts used for quality management purposes in newer radiotherapy clinics. It also provides a pictorial overview of the clinic's performance for advanced radiotherapy techniques. Higher Cpm values for the EPID indicate its higher efficiency compared with array-based measurements.
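
    The process capability index Cpm reported above can be computed from the γ pass rates once a specification window and a target are chosen. The sketch below uses the usual Taguchi-style definition of Cpm; the specification limits, the target and the pass-rate data are assumptions for illustration, not the study's values.

```python
import numpy as np

def cpm(data, lsl, usl, target):
    """Taguchi capability index: spread is penalised for distance from target."""
    data = np.asarray(data, dtype=float)
    tau = np.sqrt(np.var(data, ddof=1) + (data.mean() - target) ** 2)
    return (usl - lsl) / (6.0 * tau)

# Hypothetical gamma pass rates (%) for one cohort of pre-treatment QA plans
gamma_pass = np.array([99.9, 99.5, 100.0, 99.2, 99.8, 99.7, 98.9, 99.6, 99.9, 99.4])

# Assumed specification window: 95-100 % with target 100 %
print(f"Cpm = {cpm(gamma_pass, lsl=95.0, usl=100.0, target=100.0):.2f}")
```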

  20. Design of a Quality Control Program for the Measurement of Gross Alpha and Gross Beta Activities (LMPR-CIEMAT)

    International Nuclear Information System (INIS)

    Alvarez, A.; Yague, L.; Gasco, C.; Navarro, N.; Higueras, E.; Noguerales, C.

    2010-01-01

    In accordance with international standards, the general requirements for testing laboratories have to include a quality system for planning, implementing, and assessing the work performed by the organization and for carrying out the required quality assurance and quality control. The purpose of internal laboratory quality control is to monitor performance, identify problems, and initiate corrective actions. This report describes the internal quality control used to monitor the determination of gross alpha and beta activities. The identification of specific performance indicators, the principles that govern their use and the statistical means of their evaluation are explained. Finally, the calculation of alpha and beta specific activities, uncertainties and detection limits is presented. (Author) 10 refs.

  1. Quality assurance and quality control of nuclear engineering during construction phase

    International Nuclear Information System (INIS)

    Zhang Zhihua; Deng Yue; Liu Yaoguang; Xu Xianqi; Zhou Shan; Qian Dazhi; Zhang Yang

    2007-01-01

    Quality assurance (QA) and quality control (QC) are very important activities in nuclear engineering. This paper starts with how to establish the quality assurance system for the construction phase of a nuclear engineering project, and then introduces several experiences and techniques, such as the implementation of the quality assurance program, the quality assurance and quality control of contractors, the quality surveillance and control of supervisory companies, and quality assurance audits and surveillance of builders. (authors)

  2. Sleep quality in patients with xerostomia: a prospective and randomized case-control study.

    Science.gov (United States)

    Lopez-Jornet, Pia; Lucero Berdugo, Maira; Fernandez-Pujante, Alba; C, Castillo Felipe; Lavella C, Zamora; A, Pons-Fuster; J, Silvestre Rangil; Silvestre, Francisco Javier

    2016-01-01

    Objectives: To investigate sleep quality, anxiety/depression and quality-of-life in patients with xerostomia. Materials and methods: This prospective, observational, cross-sectional study was conducted among a group of xerostomia patients (n = 30) compared with 30 matched control subjects. The following evaluation scales were used to assess the psychological profile of each patient: the Hospital Anxiety and Depression Scale, the Oral Health Impact Profile-14 (OHIP-14), the Xerostomia Inventory, the Pittsburgh Sleep Quality Index (PSQI) and the Epworth Sleepiness Scale (ESS). Results: The PSQI score was 5.33 ± 1.78 for patients with xerostomia compared with 4.26 ± 1.01 for control subjects (p = 0.006); the ESS score was 5.7 ± 2.1 for test patients vs 4.40 ± 1 for control subjects (p = 0.010). Statistical regression analysis showed that xerostomia was significantly associated with depression (p = 0.027). Conclusions: Patients with xerostomia exhibited significant decreases in sleep quality compared with control subjects.

  3. Matched case-control studies: a review of reported statistical methodology

    Directory of Open Access Journals (Sweden)

    Niven DJ

    2012-04-01

    Background: Case-control studies are a common and efficient means of studying rare diseases or illnesses with long latency periods. Matching of cases and controls is frequently employed to control the effects of known potential confounding variables. The analysis of matched data requires specific statistical methods. Methods: The objective of this study was to determine the proportion of published, peer-reviewed matched case-control studies that used statistical methods appropriate for matched data. Using a comprehensive set of search criteria we identified 37 matched case-control studies for detailed analysis. Results: Among these 37 articles, only 16 studies were analyzed with proper statistical techniques (43%). Studies that were properly analyzed were more likely to have included case patients with cancer and cardiovascular disease compared to those that did not use proper statistics (10/16 or 63%, versus 5/21 or 24%, P = 0.02). They were also more likely to have matched multiple controls for each case (14/16 or 88%, versus 13/21 or 62%, P = 0.08). In addition, studies with properly analyzed data were more likely to have been published in a journal with an impact factor listed in the top 100 according to the Journal Citation Reports index (12/16 or 69%, versus 1/21 or 5%, P ≤ 0.0001). Conclusion: The findings of this study raise concern that the majority of matched case-control studies report results that are derived from improper statistical analyses. This may lead to errors in estimating the relationship between a disease and exposure, as well as the incorrect adaptation of emerging medical literature. Keywords: case-control, matched, dependent data, statistics

  4. Analytical quality control of neutron activation analysis by interlaboratory comparison and proficiency test

    International Nuclear Information System (INIS)

    Kim, S. H.; Moon, J. H.; Jeong, Y. S.

    2002-01-01

    Two air filters (V-50, P-50) artificially loaded with urban dust were provided by the IAEA, and trace elements were determined non-destructively by instrumental neutron activation analysis for an inter-laboratory comparison and proficiency test study. The standard reference material Urban Particulate Matter (NIST SRM 1648) of the National Institute of Standards and Technology was used for internal analytical quality control. About 20 elements were determined in each loaded filter sample. Our analytical data were compared with statistical results obtained using neutron activation analysis, particle induced X-ray emission spectrometry, inductively coupled plasma mass spectrometry, etc., which were collected from 49 laboratories in 40 countries. From the results, which were statistically re-treated together with the reported values, the Z-scores of our analytical values are within ±2. In addition, the proficiency test was passed, and the accuracy and precision of the analytical values are reliable. Consequently, it was proved that the analytical quality control for the analysis of air dust samples is reasonable

  5. Evaluation and quality control of radiometric and spectrometric methods used in LVRA of Cuba CPHR

    International Nuclear Information System (INIS)

    Perez, Danyl; Prendes, Miguel; Fernandez, Isis M.; Gonzalez, Leidy

    1997-01-01

    The Environmental Radiological Surveillance Laboratory of the Center for Hygiene and Radiation Protection of Cuba has developed a quality assurance system for the analytical services that it offers. The laboratory has designed and implemented a statistical control program for some measurements of standard samples and certified reference materials. Statistical tests were used to determine the deviation of the experimental results from the certified ones. The statistical analysis of the results showed that the methods used are precise and accurate, in agreement with the requirements demanded for the measurements currently made in the laboratory. As a result, a methodology for quality assurance of the described methods was established, and this experience is applicable to other analytical methods in the laboratory. (author). 15 refs., fig., 3 tabs

  6. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-MV beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than in VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than in VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and that of VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because the Cpml values are higher than 1.0.

  7. Statistical assessment of quality of credit activity of Ukrainian banks

    Directory of Open Access Journals (Sweden)

    Moldavska Olena V.

    2013-03-01

    The article conducts an economic and statistical analysis of the current state of the credit activity of Ukrainian banks and the main tendencies of its development. It justifies the urgency of a statistical study of the credit activity of banks. It offers a complex system of assessment of bank lending at two levels: the level of the banking system and the level of an individual bank. The use of systems analysis allows the interconnection between the effectiveness of the functioning of the banking system and the quality of the credit portfolio to be reflected. The article considers the main aspects of credit portfolio quality management: the level of troubled debt and credit risk. It touches on the problem of an adequate quantitative assessment of troubled loans in the credit portfolios of banks, since the calculation methodologies used by the National Bank of Ukraine and international rating agencies are quite different. The article presents a system of credit risk management methods, both theoretically and with specific examples, in the context of preventing the occurrence of risk situations or eliminating their consequences.

  8. Effective control of complex turbulent dynamical systems through statistical functionals.

    Science.gov (United States)

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
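
    The 40-dimensional Lorenz 1996 model used as a test bed above is easy to reproduce. The sketch below integrates the standard L96 equations with a fourth-order Runge-Kutta step and reports the kind of statistical quantities (mean state and total variance) that a statistical control strategy would target. The forcing value, time step and spin-up length are conventional choices assumed for illustration, not the paper's settings, and the control strategy itself is not implemented here.

```python
import numpy as np

def l96_rhs(x, forcing):
    """Standard Lorenz 96 tendencies: dx_j/dt = (x_{j+1} - x_{j-2}) x_{j-1} - x_j + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing):
    k1 = l96_rhs(x, forcing)
    k2 = l96_rhs(x + 0.5 * dt * k1, forcing)
    k3 = l96_rhs(x + 0.5 * dt * k2, forcing)
    k4 = l96_rhs(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

J, F, dt = 40, 8.0, 0.01                 # 40 variables, fully turbulent forcing regime
x = F + 0.01 * np.random.default_rng(3).standard_normal(J)

states = []
for step in range(20000):
    x = rk4_step(x, dt, F)
    if step > 5000:                      # discard the spin-up transient
        states.append(x.copy())
states = np.array(states)

# Statistical functionals a control strategy would steer: mean state and total variance
print("mean state:", states.mean())
print("total variance:", states.var(axis=0).sum())
```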

  9. 40 CFR 51.359 - Quality control.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Quality control. 51.359 Section 51.359....359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  10. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
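
    As a small illustration of the Pareto-chart idea mentioned in this record, the sketch below ranks hypothetical categories of incidents by frequency and reports the cumulative percentage, which is the tabular core of a Pareto chart. The category names and counts are invented for demonstration.

```python
from collections import Counter

# Hypothetical incident log from a human service setting
incidents = (["missed appointment"] * 23 + ["medication error"] * 9 +
             ["documentation gap"] * 31 + ["late report"] * 12 + ["other"] * 5)

counts = Counter(incidents).most_common()       # sorted, most frequent first
total = sum(c for _, c in counts)

cumulative = 0
for category, count in counts:
    cumulative += count
    print(f"{category:20s} {count:3d}  {100 * cumulative / total:5.1f}% cumulative")
```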

  11. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    Science.gov (United States)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under conditions of dynamic production. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be dealt with. Artificial intelligence algorithms, including Bayesian network learning, classification and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, a Bayesian network of the factors affecting quality is built based on the prior probability distribution and modified with the posterior probability distribution. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost directly proportionally to the increase in computing nodes. It is also proved that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.

  12. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and reliability of fabrication processes as the original sources of a good and reliable quality of the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of fabrication processes, assuming they are stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance oriented. However, performance orientation is only successful through open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the West, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase the openness of the vendors toward the using utility. The responsibility of the utilities to ensure an adequate quality of the fuel they receive increased considerably. As a matter of fact, the utilities sometimes had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the West. However, it now also reaches the utilities in the eastern countries. (author)

  13. Analytical quality, performance indices and laboratory service

    DEFF Research Database (Denmark)

    Hilden, Jørgen; Magid, Erik

    1999-01-01

    analytical error, bias, cost effectiveness, decision-making, laboratory techniques and procedures, mass screening, models, statistical, quality control

  14. Effectiveness of Self-Control Training on Quality of Life Dimensions in Migraine Patients

    Directory of Open Access Journals (Sweden)

    Esmaeel Soleimani

    2016-06-01

    Background: Migraine is a chronic neurological disorder that leads patients to avoid many kinds of activity. Since different factors are involved in the incidence of migraine and its triggers, the drugs used to prevent or treat it are highly variable, and combined medications are also used to relieve migraine. This study examined the effectiveness of self-control training on quality of life in patients with migraine. Materials and Methods: The statistical population of this study included all migraine patients in Ardabil in 2014 (estimated N = 1150), from whom 40 patients were selected by convenience sampling. A demographic and disease information questionnaire and the quality of life questionnaire (SF-36) were used to collect data in clinical centers. Multivariate analysis of covariance (MANCOVA) was used to analyze the data, because the present research was an experimental clinical trial with a pre-test/post-test control group design. Results: The results showed that there is a significant difference between the mean quality of life of migraine patients and control subjects; that is, the physical health and mental health dimensions of quality of life differed between the control and experimental groups after self-control training. Conclusion: Self-control training can be used to enhance the quality of life of migraine patients. These results have important applications in the treatment of migraine patients. In general, specialists in clinical centers can use this method alongside other treatment interventions.

  15. Network-based production quality control

    Science.gov (United States)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. The recent emphasis on product quality and the reduction of waste stems from the dynamic, globalized and customer-driven market, which brings opportunities and threats to companies, depending on their response speed and production strategies. Current trends in industry also include the wide spread of distributed manufacturing systems, where design, production, and management facilities are geographically dispersed. This situation mandates not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete in such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing" or "EQM" for short. By its definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such systems let designers located far away from the production facility monitor, control and adjust the quality inspection processes as the product design evolves.

  16. Multivariate control charts based on net analyte signal (NAS) and Raman spectroscopy for quality control of carbamazepine

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Werickson Fortunato de Carvalho [Institute of Chemistry, University of Campinas - UNICAMP, P.O. Box 6154, 13083-970 Campinas, SP (Brazil); National Institute of Metrology, Standardization and Industrial Quality, Inmetro, Dimci/Dquim - Directorate of Metrology, Science and Industry/Division of Chemical Metrology, Av. Nossa Senhora das Gracas 50, Building 6, 25250-020, Xerem, Duque de Caxias, RJ (Brazil); Poppi, Ronei Jesus, E-mail: ronei@iqm.unicamp.br [Institute of Chemistry, University of Campinas - UNICAMP, P.O. Box 6154, 13083-970 Campinas, SP (Brazil); National Institute of Science and Technology (INCT) for Bioanalytics, 13083-970 Campinas, SP (Brazil)

    2011-10-31

    Raman spectroscopy and control charts based on the net analyte signal (NAS) were applied to the polymorphic characterization of carbamazepine. Carbamazepine presents four polymorphic forms: I-IV (dihydrate). X-ray powder diffraction was used as a reference technique. The control chart methodology was built by generating three charts: the NAS chart, which corresponds to the analyte of interest (form III in this case), the interference chart, which corresponds to the contribution of other compounds in the sample, and the residual chart, which corresponds to nonsystematic variations. For each chart, statistical limits were developed using samples within the quality specifications. It was possible to identify the different polymorphic forms of carbamazepine present in pharmaceutical formulations. Thus, an alternative method for the quality monitoring of the carbamazepine polymorphic forms after the crystallization process is presented.
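
    A compact way to see how NAS-based charts split each spectrum into monitored parts is sketched below: the interference space is spanned by spectra of the other constituents, and the net analyte signal is the portion of a new spectrum orthogonal to that space, with chart limits set from in-specification samples. All spectra here are simulated Gaussian bands, only the NAS and interference statistics are tracked (the residual chart is omitted for brevity), and the decomposition is a generic orthogonal-projection version of the NAS idea, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
wavenumbers = np.linspace(0, 1, 200)

def band(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

analyte_pure = band(0.35, 0.03)                              # polymorph of interest
interferents = np.stack([band(0.6, 0.05), band(0.8, 0.04)])  # other constituents

# Projector onto the space orthogonal to the interferent spectra
B = interferents.T                                           # columns span interference space
P_orth = np.eye(len(wavenumbers)) - B @ np.linalg.pinv(B)

def decompose(spectrum):
    nas_part = P_orth @ spectrum                             # net analyte signal
    interference_part = spectrum - nas_part                  # explained by interferents
    return np.linalg.norm(nas_part), np.linalg.norm(interference_part)

# In-specification calibration samples define the chart limits
calib = [decompose(0.9 * analyte_pure + 0.5 * interferents[0]
                   + 0.3 * interferents[1] + rng.normal(0, 0.01, 200))
         for _ in range(30)]
nas_vals, _ = np.array(calib).T
nas_cl, nas_sl = nas_vals.mean(), 3 * nas_vals.std(ddof=1)

# A test sample with too little analyte should fall below the lower NAS limit
test_nas, _ = decompose(0.5 * analyte_pure + 0.5 * interferents[0]
                        + 0.3 * interferents[1] + rng.normal(0, 0.01, 200))
print(f"NAS statistic {test_nas:.3f}, limits {nas_cl - nas_sl:.3f} .. {nas_cl + nas_sl:.3f}")
```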

  17. Multivariate control charts based on net analyte signal (NAS) and Raman spectroscopy for quality control of carbamazepine

    International Nuclear Information System (INIS)

    Rocha, Werickson Fortunato de Carvalho; Poppi, Ronei Jesus

    2011-01-01

    Raman spectroscopy and control charts based on the net analyte signal (NAS) were applied to the polymorphic characterization of carbamazepine. Carbamazepine presents four polymorphic forms: I-IV (dihydrate). X-ray powder diffraction was used as a reference technique. The control chart methodology was built by generating three charts: the NAS chart, which corresponds to the analyte of interest (form III in this case), the interference chart, which corresponds to the contribution of other compounds in the sample, and the residual chart, which corresponds to nonsystematic variations. For each chart, statistical limits were developed using samples within the quality specifications. It was possible to identify the different polymorphic forms of carbamazepine present in pharmaceutical formulations. Thus, an alternative method for the quality monitoring of the carbamazepine polymorphic forms after the crystallization process is presented.

  18. No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.

    Science.gov (United States)

    Li, Xuelong; Guo, Qun; Lu, Xiaoqiang

    2016-05-13

    It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly ever considered simultaneously. In this paper, we propose a new NR-VQA metric based on spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortion and robust across different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
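
    To make the feature-extraction step concrete, the sketch below takes a small synthetic video cube, applies a 3D DCT with SciPy, and fits a generalized Gaussian to the AC coefficients so that the fitted shape parameter can serve as one spatiotemporal NVS-style feature. The block size, the choice of coefficients and the synthetic video are illustrative assumptions, not the paper's configuration, and the full feature set and SVR stage are omitted.

```python
import numpy as np
from scipy.fft import dctn
from scipy.stats import gennorm

rng = np.random.default_rng(5)

# Synthetic grayscale video cube: 8 frames of 32x32 pixels with smooth structure + noise
t, y, x = np.meshgrid(np.arange(8), np.arange(32), np.arange(32), indexing="ij")
video = np.sin(0.2 * x + 0.1 * t) + 0.3 * rng.standard_normal((8, 32, 32))

def nvs_shape_feature(cube):
    """Shape parameter of a generalized Gaussian fitted to the 3D-DCT AC coefficients."""
    coeffs = dctn(cube, norm="ortho")
    ac = coeffs.ravel()[1:]                       # drop the DC term
    beta, loc, scale = gennorm.fit(ac, floc=0.0)  # location fixed at 0 for symmetric coefficients
    return beta

print(f"GGD shape parameter: {nvs_shape_feature(video):.3f}")
```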

  19. Population-based cancer survival in the United States: Data, quality control, and statistical methods.

    Science.gov (United States)

    Allemani, Claudia; Harewood, Rhea; Johnson, Christopher J; Carreira, Helena; Spika, Devon; Bonaventure, Audrey; Ward, Kevin; Weir, Hannah K; Coleman, Michel P

    2017-12-15

    Robust comparisons of population-based cancer survival estimates require tight adherence to the study protocol, standardized quality control, appropriate life tables of background mortality, and centralized analysis. The CONCORD program established worldwide surveillance of population-based cancer survival in 2015, analyzing individual data on 26 million patients (including 10 million US patients) diagnosed between 1995 and 2009 with 1 of 10 common malignancies. In this Cancer supplement, we analyzed data from 37 state cancer registries that participated in the second cycle of the CONCORD program (CONCORD-2), covering approximately 80% of the US population. Data quality checks were performed in 3 consecutive phases: protocol adherence, exclusions, and editorial checks. One-, 3-, and 5-year age-standardized net survival was estimated using the Pohar Perme estimator and state- and race-specific life tables of all-cause mortality for each year. The cohort approach was adopted for patients diagnosed between 2001 and 2003, and the complete approach for patients diagnosed between 2004 and 2009. Articles in this supplement report population coverage, data quality indicators, and age-standardized 5-year net survival by state, race, and stage at diagnosis. Examples of tables, bar charts, and funnel plots are provided in this article. Population-based cancer survival is a key measure of the overall effectiveness of services in providing equitable health care. The high quality of US cancer registry data, 80% population coverage, and use of an unbiased net survival estimator ensure that the survival trends reported in this supplement are robustly comparable by race and state. The results can be used by policymakers to identify and address inequities in cancer survival in each state and for the United States nationally. Cancer 2017;123:4982-93. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  20. Preparation of quality control samples for thyroid hormones T3 and T4 in radioimmunoassay techniques

    International Nuclear Information System (INIS)

    Ahmed, F.O.A.

    2006-03-01

    Today, radioimmunoassay has become one of the best techniques for the quantitative analysis of very low concentrations of different substances. RIA is widely used in medical and research laboratories. To maintain high specificity and accuracy in RIA and other related techniques, quality controls must be introduced. In this dissertation, quality control samples were prepared for the thyroid hormones triiodothyronine (T3) and thyroxine (T4) using RIA techniques. Ready-made Chinese T3 and T4 RIA kits were used, and the IAEA statistical package was selected. (Author)

  1. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
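
    The general shape-parameter idea described above can be sketched with simple directional difference filters standing in for the paper's multidirectional filters: a generalized Gaussian is fitted to each filter-response histogram, and the quality score is the distance between the test image's shape parameters and a standard built from undistorted images. The filters, the synthetic "clean" image, the distortion and the distance measure are all assumptions for demonstration, not the authors' configuration.

```python
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(6)

def directional_features(img):
    """GGD shape parameters of responses to four simple directional difference filters."""
    responses = [np.diff(img, axis=0),                       # vertical
                 np.diff(img, axis=1),                       # horizontal
                 img[1:, 1:] - img[:-1, :-1],                # diagonal
                 img[1:, :-1] - img[:-1, 1:]]                # anti-diagonal
    return np.array([gennorm.fit(r.ravel(), floc=0.0)[0] for r in responses])

# Hypothetical "standard" built from an undistorted image and a noisy test image
clean = np.sin(np.linspace(0, 8 * np.pi, 128))[:, None] * np.ones((1, 128))
standard = directional_features(clean + 0.02 * rng.standard_normal(clean.shape))
test = directional_features(clean + 0.30 * rng.standard_normal(clean.shape))

# Quality score: distance between the test feature statistics and the standard
print("quality score (lower is better):", float(np.abs(test - standard).sum()))
```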

  2. Radiopharmaceutical quality control-Pragmatic approach

    International Nuclear Information System (INIS)

    Barbier, Y.

    1994-01-01

    Quality control must be considered in a practical manner. Radiopharmaceuticals are drugs and must satisfy quality assurance requirements; these products must therefore conform to the Pharmacopoeia. Nevertheless, the user must sometimes check certain parameters, especially radiochemical purity and pH. For all administered solutions, four controls are compulsory: radionuclide identity, administered radioactivity, organoleptic character and pH.

  3. 7 CFR 981.42 - Quality control.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  4. Quality control of dosemeters

    International Nuclear Information System (INIS)

    Mendes, L.

    1984-01-01

    Nuclear medicine laboratories are required to assay samples of radioactivity to be administered to patients. Almost universally, these assays are accomplished by use of a well ionization chamber isotope calibrator. The Instituto de Radioprotecao e Dosimetria (Institute for Radiological Protection and Dosimetry) of the Comissao Nacional de Energia Nuclear (National Commission for Nuclear Energy) is carrying out a National Quality Control Programme in Nuclear Medicine, supported by the International Atomic Energy Agency. The assessment of the current needs and practices of quality control in the entire country of Brazil includes dose calibrators and scintillation cameras, but this manual is restricted to the former. Quality control procedures for these instruments are described in this document together with specific recommendations and an assessment of their accuracy. (Author) [pt

  5. Statistical analysis of the influence of wheat black point kernels on selected indicators of wheat flour quality

    Directory of Open Access Journals (Sweden)

    Petrov Verica D.

    2011-01-01

    The influence of wheat black point kernels on selected indicators of wheat flour quality - farinograph and extensograph indicators, amylolytic activity, wet gluten and flour ash content - was examined in this study. The examinations were conducted on samples of wheat harvested in 2007 and 2008 from the area of Central Banat, in four treatments: a control (without black point flour) and treatments with 2, 4 and 10% black point flour added as a replacement for part of the control sample. Statistically significant differences between treatments were observed in dough stability, falling number and extensibility. The samples with 10% black point flour had the lowest dough stability and the highest amylolytic activity and extensibility. There was a trend of increasing 15-min drop and water absorption with an increasing share of black point flour. Extensograph area, resistance and the ratio of resistance to extensibility decreased with the addition of black point flour, but not consistently. The Mahalanobis distance indicates that the addition of 10% black point flour had the greatest influence on the observed quality indicators, thus proving that black point influences the technological quality of wheat, i.e. flour.

  6. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2018-06-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, since meteorology and topography affect the dispersion of air pollutants. This study used statistical tools (PCA, HCA, the Kruskal-Wallis and Mann-Whitney tests, and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The results of the statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the air basins defined previously were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important for air quality management.
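
    The seasonal comparison described above rests on standard non-parametric tests that are easy to reproduce. The sketch below applies the Kruskal-Wallis and Mann-Whitney tests from SciPy to invented PM2.5 concentrations grouped by season; the data are purely illustrative and are not related to the monitoring sites in the study.

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(8)

# Hypothetical daily PM2.5 concentrations (ug/m3) grouped by season at one site
summer = rng.gamma(shape=4.0, scale=3.5, size=90)
autumn = rng.gamma(shape=4.0, scale=3.6, size=90)
winter = rng.gamma(shape=4.0, scale=3.7, size=90)
spring = rng.gamma(shape=4.0, scale=3.5, size=90)

# Kruskal-Wallis: do the four seasonal distributions differ?
h_stat, p_kw = kruskal(summer, autumn, winter, spring)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.3f}")

# Pairwise Mann-Whitney comparison between two seasons of interest
u_stat, p_mw = mannwhitneyu(summer, winter, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.0f}, p = {p_mw:.3f}")
```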

  7. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2017-03-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, since meteorology and topography affect the dispersion of air pollutants. This study used statistical tools (PCA, HCA, the Kruskal-Wallis and Mann-Whitney tests, and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The results of the statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the air basins defined previously were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important for air quality management.

  8. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that

  9. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  10. 7 CFR 930.44 - Quality control.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  11. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: Topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry. .

  12. Software for creating quality control database in diagnostic radiology

    International Nuclear Information System (INIS)

    Stoeva, M.; Spassov, G.; Tabakov, S.

    2000-01-01

    The paper describes a PC-based program with a database for quality control (QC). It keeps information about all surveyed equipment and measured parameters. The first function of the program is to extract information from old (existing) MS Excel spreadsheets with QC surveys. The second function is used for the input of measurements, which are automatically organized in MS Excel spreadsheets and built into the database. The spreadsheets are based on the protocols described in the EMERALD Training Scheme. In addition, the program can produce statistics for all measured parameters, both in absolute terms and over time

  13. [Flavouring estimation of quality of grape wines with use of methods of mathematical statistics].

    Science.gov (United States)

    Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D

    2016-01-01

    The formation of an integral estimation of wine flavour during tasting is discussed, and the advantages and disadvantages of the procedures are described. As study materials we used natural white and red wines from Russian manufacturers made with traditional technologies from Vitis vinifera, direct hybrids, blends and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of the wine's nonvolatile matter and its tasting quality rating. The contents of organic acids, amino acids and cations in the wines were considered the main factors influencing the flavour; basically, they define the beverage's quality. The determination of those components in the wine samples was done by the electrophoretic method on a «CAPEL» system. Together with the analytical checking of the quality of the wine samples, a representative group of specialists simultaneously carried out a tasting estimation using a 100-point system. The possibility of statistically modelling the correlation of the tasting estimation with the analytical data on amino acids and cations, which reasonably describe the wine's flavour, was examined. The statistical modelling of the correlation between the tasting estimation and the content of major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical valuation within fixed quality limits, was done with Statistica. Adequate statistical models able to predict the tasting estimation, that is, to determine the wine's quality from the content of flavour-forming components, have been constructed. It is emphasized that along with aromatic (volatile) substances, the nonvolatile matter - mineral substances and organic substances such as the amino acids proline, threonine and arginine

  14. Quality control analysis at the hospital

    International Nuclear Information System (INIS)

    Kristensen, K.

    1979-01-01

    Quality control analysis is an integral part of quality assurance. In a system such as that for radiopharmaceuticals, where part of the finishing of the product takes place at individual hospitals, the need for quality control analysis at the hospital can be discussed. Data are presented that stress the importance of quality control by the manufacturer as a basis for limiting such work at hospitals. A simplified programme is proposed

  15. Towards an integrated quality control procedure for eddy-covariance data

    Science.gov (United States)

    Vitale, Domenico; Papale, Dario

    2017-04-01

    The eddy-covariance technique is nowadays the most reliable and direct way to calculate the main fluxes of sensible and latent heat and of net ecosystem exchange, the latter being the difference between the CO2 assimilated by photosynthetic activity and that released to the atmosphere through ecosystem respiration processes. Despite improvements in the accuracy of measurement instruments and in software development, the eddy-covariance technique is not suitable under conditions that are non-ideal with respect to the instrument characteristics and the physical assumptions behind the technique, mainly related to well-developed and stationary turbulence. Under these conditions the calculated fluxes are not reliable and need to be flagged and discarded. In order to detect these unavoidable "bad" fluxes and build datasets of the highest quality, several tests applied both to high-frequency (10-20 Hz) raw data and to half-hourly time series have been developed in past years. Nevertheless, there is an increasing need to develop a standardized quality control procedure suitable not only for the analysis of long-term data, but also for near-real-time data processing. In this paper, we review established quality assessment procedures and present an innovative quality control strategy with the purpose of integrating the existing consolidated procedures with robust and advanced statistical tests more suitable for the analysis of time series data. The performance of the proposed quality control strategy is evaluated both on simulated data and on EC data distributed by the ICOS research infrastructure. It is concluded that the proposed strategy is able to flag and exclude unrealistic fluxes while being reproducible and retaining the largest possible amount of high-quality data.

  16. Development on quality management concepts

    OpenAIRE

    Dragan Cristian; Stanca Costel

    2011-01-01

    The purpose of this paper is to analyse the history of Total Quality Management (TQM) in the private sector, taking a closer look at its five stages in the Western hemisphere: quality inspection, statistical quality control, system-oriented quality assurance, company-wide quality control and total quality management.

  17. Numerical and Qualitative Contrasts of Two Statistical Models for Water Quality Change in Tidal Waters

    Science.gov (United States)

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...

  18. 14 CFR 21.139 - Quality control.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Quality control. 21.139 Section 21.139... PROCEDURES FOR PRODUCTS AND PARTS Production Certificates § 21.139 Quality control. The applicant must show that he has established and can maintain a quality control system for any product, for which he...

  19. 33 CFR 385.21 - Quality control.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  20. Total quality control: the deming management philosophy applied to nuclear power plants

    International Nuclear Information System (INIS)

    Heising, C.D.; Wetherell, D.L.; Melhem, S.A.; Sato, M.

    1987-01-01

    In recent years, a call has come for the development of inherently safe nuclear reactor systems that cannot have large-scale accidents. In the search for the perfect inherently safe reactor system, some are calling for computerized, automated control of reactors, eliminating most human operators from the control room. A different approach to the control of inherently safe reactors is that both future and present nuclear power plants need to apply total quality control (TQC) to plant operations and management. The Deming management philosophy of TQC has been implemented in a wide range of industries, particularly in Japan and the US. Specific attention is given, however, to TQC implementation in the electric power industry as applied to nuclear plants. The Kansai Electric Power Company and Florida Power and Light Company have recently implemented TQC. Statistical quality control methods have been applied to monitor and control reactor variables (for example, the steam generator water level important to start-up operations of pressurized water reactors).

  1. Statistical process control charts for monitoring military injuries.

    Science.gov (United States)

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
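
    The chart construction described above can be sketched in a few lines: the centre line is the pooled historical rate and the three-sigma limits move with each quarter's person-time. The counts and person-years below are invented, and the example uses eight quarters rather than the 28 used in the study.

```python
# Minimal sketch of Shewhart u-chart limits for injury rates per 1000 person-years.
import numpy as np

# Invented quarterly data for one installation: injury counts and person-years at risk.
injuries     = np.array([9900, 10000, 9950, 10050, 9900, 10000, 11200, 9950])
person_years = np.array([7500, 7520, 7480, 7510, 7490, 7500, 7505, 7495])

n = person_years / 1000.0              # inspection units of 1000 person-years
u = injuries / n                       # observed rate per 1000 person-years
u_bar = injuries.sum() / n.sum()       # centre line from the pooled history

ucl = u_bar + 3 * np.sqrt(u_bar / n)   # limits widen when less person-time is observed
lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / n), 0)

for q, (rate, lo, hi) in enumerate(zip(u, lcl, ucl), start=1):
    status = "signal" if (rate > hi or rate < lo) else "in control"
    print(f"Q{q}: {rate:7.1f} per 1000 p-y, limits ({lo:.1f}, {hi:.1f}) -> {status}")
```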

  2. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  3. Random Forest Application for NEXRAD Radar Data Quality Control

    Science.gov (United States)

    Keem, M.; Seo, B. C.; Krajewski, W. F.

    2017-12-01

    Identification and elimination of non-meteorological radar echoes (e.g., returns from ground, wind turbines, and biological targets) are basic data quality control steps that precede radar data use in quantitative applications (e.g., precipitation estimation). Although the WSR-88Ds' recent upgrade to dual polarization has enhanced this quality control and echo classification, there are still challenges in detecting some non-meteorological echoes that show precipitation-like characteristics (e.g., wind turbine or anomalous propagation clutter embedded in rain). With this in mind, a new quality control method using Random Forest is proposed in this study. This classification algorithm is known to produce reliable results with less uncertainty. The method introduces randomness into sampling and feature selection and aggregates the resulting multiple decision trees. The multidimensional structure of the trees can characterize the statistical interactions of the multiple features involved in complex situations. The authors explore the performance of the Random Forest method for NEXRAD radar data quality control. Training datasets are selected using several clear cases of precipitation and non-precipitation (but with some non-meteorological echoes). The model is structured using available candidate features (from the NEXRAD data) such as horizontal reflectivity, differential reflectivity, differential phase shift, copolar correlation coefficient, and their horizontal textures (e.g., local standard deviation). The influence of each feature on the classification results is quantified by variable importance measures that are automatically estimated by the Random Forest algorithm. Therefore, the number and types of features in the final forest can be examined based on the classification accuracy. The authors demonstrate the capability of the proposed approach using several cases ranging from distinct to complex rain/no-rain events and compare the performance with the existing algorithms (e
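
    A minimal sketch of this kind of classifier, using scikit-learn's RandomForestClassifier, is shown below. The dual-polarization feature names and the synthetic two-class training set are assumptions for illustration; real training data would come from labelled radar gates.

```python
# Sketch: Random Forest separating precipitation from non-meteorological echoes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
features = ["Zh", "Zdr", "PhiDP", "RhoHV", "Zh_texture"]

# Synthetic gates: label 1 = precipitation echo, label 0 = non-meteorological echo.
X_met = np.column_stack([rng.normal(30, 8, 500), rng.normal(0.8, 0.5, 500),
                         rng.normal(60, 20, 500), rng.normal(0.98, 0.01, 500),
                         rng.normal(2, 1, 500)])
X_clt = np.column_stack([rng.normal(25, 15, 500), rng.normal(2.5, 2.0, 500),
                         rng.normal(90, 60, 500), rng.normal(0.80, 0.10, 500),
                         rng.normal(8, 4, 500)])
X = np.vstack([X_met, X_clt])
y = np.array([1] * 500 + [0] * 500)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Variable importance measures, as mentioned in the abstract, come for free:
for name, imp in zip(features, clf.feature_importances_):
    print(f"{name}: importance {imp:.2f}")
```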

  4. Related regulation of quality control of industrial products

    International Nuclear Information System (INIS)

    1983-04-01

    This book introduces the regulations related to quality control of industrial products, including the regulations on industrial product quality control, the enforcement ordinance and enforcement regulations for quality control of industrial products, designated items carrying an industrial product quality indication, industrial product quality testing, quality test organizations, and management tips on factory quality by grade.

  5. Quality control guarantees the safety of radiotherapy

    International Nuclear Information System (INIS)

    Aaltonen, P.

    1994-01-01

    While radiotherapy equipment has seen some decisive improvements in the last few decades, the technology has also become more complicated. The advanced equipment produces increasingly good treatment results, but the condition of the equipment must be controlled efficiently so as to eliminate any defects that might jeopardise patient safety. The quality assurance measures that are taken to show that certain equipment functions as required are known as quality control. The advanced equipment and stricter requirements set for the precision of radiotherapy have meant that more attention must be paid to quality control. The present radiation legislation stipulates that radiotherapy equipment must undergo regular quality control. The implementation of the quality control is supervised by the Finnish Centre for Radiation and Nuclear Safety (STUK). Hospitals carry out quality control in accordance with a programme approved by STUK, and STUK inspectors periodically visit hospitals to check the results of quality control. (orig.)

  6. The quality analysis system implemented by FRAGEMA

    International Nuclear Information System (INIS)

    Kopff, G.

    1988-01-01

    Systematic statistical processing of measurements and quality control data obtained through manufacturing and conformity inspection is necessary for global knowledge of the fuel quality, which is useful both to the designer and to the manufacturer. For this aim the quality control data management and processing system implemented by FRAGEMA is described and illustrated with examples of the different types of statistical quality reports which are printed out. (orig.)

  7. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation when the need arises.

  8. Quality and reliability control on assemblies

    International Nuclear Information System (INIS)

    Mueller, H.

    1976-01-01

    Taking as an example electronic assemblies in printed circuit board engineering, quality control during manufacture is dealt with. After giving a survey of four phases of quality and reliability control, some specific methods of quality control are dealt with by means of a flowchart, and by some examples the necessity and the success of these measures are shown. (RW) [de

  9. Statistical tests applied as quality control measures to leaching of nuclear waste glasses and in the evaluation of the leach vessel

    International Nuclear Information System (INIS)

    Bokelund, H.; Deelstra, K.

    1988-01-01

    Simple statistical tests, such as regression analysis and analysis of variance, have been applied to data obtained from leaching experiments carried out under various conditions of time and temperature. The precision and the accuracy of the overall leaching procedure were evaluated considering the short-term within-laboratory effects. The data originated from determinations of the mass losses of leached glass specimens and from measurements of the electrical conductivity and the pH of the leachants. The solution conductivity correlates highly with the normalized mass loss; hence it provides a consistency check on the measurements of the latter parameter. The overall relative precision of the leaching test method was found to be 5-12%, including the effects caused by inhomogeneity of the glass specimens. The conditions for the application of the Teflon inserts often used in leaching devices have been investigated; a modified cleaning procedure is proposed to ascertain the absence of systematic errors from their repeated utilization (quality control). The operational limit of 190 °C, as specified by the Materials Characterization Center, Richland, USA, was confirmed experimentally. 8 refs.; 1 figure; 8 tabs
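
    The two simple tests named above, regression analysis and analysis of variance, can be sketched as follows; the mass-loss, conductivity and temperature-replicate values are invented and the code only illustrates how such consistency checks might be run.

```python
# Sketch: regression of leachant conductivity on normalized mass loss,
# and a one-way ANOVA of replicate mass losses across leaching temperatures.
import numpy as np
from scipy import stats

mass_loss = np.array([1.2, 1.8, 2.5, 3.1, 4.0])          # g/m^2 (invented)
conductivity = np.array([14.0, 20.5, 28.0, 34.2, 44.5])  # microS/cm (invented)
reg = stats.linregress(mass_loss, conductivity)
print(f"slope = {reg.slope:.2f}, correlation r = {reg.rvalue:.3f}")

# Replicate mass losses at three temperatures (invented)
t90  = [1.15, 1.22, 1.18]
t110 = [1.80, 1.75, 1.88]
t150 = [2.52, 2.60, 2.47]
print(stats.f_oneway(t90, t110, t150))
```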

  10. Quality Control in Mammography: Image Quality and Patient Doses

    International Nuclear Information System (INIS)

    Ciraj Bjelac, O.; Arandjic, D.; Boris Loncar, B.; Kosutic, D.

    2008-01-01

    Mammography is the method of choice for early detection of breast cancer. The purpose of this paper is a preliminary evaluation of mammography practice in Serbia in terms of two quality control indicators: image quality and patient doses. The survey demonstrated considerable variations in the technical parameters that affect image quality and patient doses. Mean glandular doses ranged from 0.12 to 2.8 mGy, while reference optical density ranged from 1.2 to 2.8. A correlation between image contrast and mean glandular dose was demonstrated. Systematic implementation of a quality control protocol should ensure satisfactory performance of mammography units, maintain satisfactory image quality and keep patient doses as low as reasonably practicable. (author)

  11. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2013-07-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing its first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 TB of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize the data range and variance of each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed, and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.
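
    A minimal sketch of range and variance ("persistence") plausibility checks of the kind described above is given below. Deriving the thresholds from a historical sample, the percentile choices, the window length and the synthetic data are all assumptions for illustration.

```python
# Sketch: automated range and variance plausibility checks with data-derived thresholds.
import numpy as np

rng = np.random.default_rng(2)
history = rng.normal(15.0, 3.0, 10000)          # e.g. past air-temperature readings (synthetic)

# Test parameter thresholds derived from the historical distribution.
lo, hi = np.percentile(history, [0.01, 99.99])  # range test limits
min_var = 1e-3                                  # variance floor to flag "stuck" sensors

def qc_flags(obs, window=20):
    obs = np.asarray(obs)
    range_fail = (obs < lo) | (obs > hi)
    var_fail = np.zeros_like(obs, dtype=bool)
    for i in range(0, len(obs) - window + 1):
        if np.var(obs[i:i + window]) < min_var:
            var_fail[i:i + window] = True
    return range_fail, var_fail

new_obs = np.concatenate([rng.normal(15, 3, 100), np.full(30, 14.2)])  # last 30 values stuck
rf, vf = qc_flags(new_obs)
print("range failures:", rf.sum(), "| variance failures:", vf.sum())
```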

  12. Improving of Quality Control and Quality Assurance in 14C and 3H Laboratory; Participation in the IAEA Model Project

    International Nuclear Information System (INIS)

    Obelic, B.

    2001-01-01

    Full text: Users of laboratories' analytical results increasingly require demonstrable proof of the reliability and credibility of the results using internationally accepted standards, because the economic, ecological, medical and legal decisions based on laboratory results need to be accepted nationally and internationally. The credibility, respect and opportunities of laboratories are improved when objective evidence of the reliability and quality of the results can be given. This is achieved through the inculcation of a quality culture via well-defined procedures, controls and operational checks characteristic of quality assurance and quality control (QA/QC). In 1999 the IAEA launched a two-and-a-half-year model project entitled Quality Control and Quality Assurance of Nuclear Analytical Techniques with the participation of laboratories using alpha, beta and/or gamma spectrometry from CEE and NIS countries. The project set out to introduce and implement QA principles in accordance with the ISO-17025 guide, leading eventually to a level at which the QA system is self-sustainable and might be appropriate for formal accreditation or certification by the respective national authorities. Activities within the project consist of semi-annual reports, two training workshops, two inspection visits of the laboratories by IAEA experts and proficiency tests. The following topics were considered: organisation requirements, acceptance criteria and non-conformance management of QC, internal and external method validation, statistical analyses and uncertainty evaluation, standard operating procedures and quality manual documentation. The 14C and 3H Laboratory of the Rudjer Boskovic Institute has been one of ten laboratories participating in the project. In the Laboratory all the procedures required for quality control were already included implicitly, while during the Model Project much effort has been devoted to the elaboration of explicit documentation. Since the beginning

  13. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR-type fuel plates by applying statistical techniques of process control. The work was divided into four steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and application of statistical tools and standard specifications to perform a comparative study of these processes. (author)

  14. VGI QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    C. C. Fonte

    2015-08-01

    Full Text Available This paper presents a framework for considering quality control of volunteered geographic information (VGI). Different issues need to be considered during the conception, acquisition and post-acquisition phases of VGI creation. These include items such as collecting metadata on the volunteer, providing suitable training, giving corrective feedback during the mapping process and the use of control data, among others. Two examples of VGI data collection are then considered with respect to this quality control framework, i.e. VGI data collection by National Mapping Agencies and by the most recent Geo-Wiki tool, a game called Cropland Capture. Although good practices are beginning to emerge, there is still a need for the development and sharing of best practice, especially if VGI is to be integrated with authoritative map products or used for calibration and/or validation of land cover in the future.

  15. 7 CFR 58.928 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be...

  16. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  17. Expert database system for quality control

    Science.gov (United States)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today. Markets are not homogeneous; they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs and, above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating the knowledge of experts in operations, quality management and computer systems, uses all information relevant to quality management, facts as well as rules, to determine if a product meets quality standards. Keywords: expert system, quality control, database

  18. 7 CFR 58.335 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.335 Section 58.335... Procedures § 58.335 Quality control tests. All milk, cream and related products are subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow...

  19. Assessment and rationalization of water quality monitoring network: a multivariate statistical approach to the Kabbini River (India).

    Science.gov (United States)

    Mavukkandy, Musthafa Odayooth; Karmakar, Subhankar; Harikumar, P S

    2014-09-01

    The establishment of an efficient surface water quality monitoring (WQM) network is a critical component in the assessment, restoration and protection of river water quality. A periodic evaluation of monitoring network is mandatory to ensure effective data collection and possible redesigning of existing network in a river catchment. In this study, the efficacy and appropriateness of existing water quality monitoring network in the Kabbini River basin of Kerala, India is presented. Significant multivariate statistical techniques like principal component analysis (PCA) and principal factor analysis (PFA) have been employed to evaluate the efficiency of the surface water quality monitoring network with monitoring stations as the evaluated variables for the interpretation of complex data matrix of the river basin. The main objective is to identify significant monitoring stations that must essentially be included in assessing annual and seasonal variations of river water quality. Moreover, the significance of seasonal redesign of the monitoring network was also investigated to capture valuable information on water quality from the network. Results identified few monitoring stations as insignificant in explaining the annual variance of the dataset. Moreover, the seasonal redesign of the monitoring network through a multivariate statistical framework was found to capture valuable information from the system, thus making the network more efficient. Cluster analysis (CA) classified the sampling sites into different groups based on similarity in water quality characteristics. The PCA/PFA identified significant latent factors standing for different pollution sources such as organic pollution, industrial pollution, diffuse pollution and faecal contamination. Thus, the present study illustrates that various multivariate statistical techniques can be effectively employed in sustainable management of water resources. The effectiveness of existing river water quality monitoring
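
    The dimensionality-reduction step at the core of such a network assessment can be sketched as follows: treat each monitoring station as a variable, standardize, and inspect how much variance the leading principal components explain and how strongly each station loads on them. The months-by-stations matrix below is synthetic and the redundancy reasoning is deliberately simplified relative to the study's PCA/PFA workflow.

```python
# Sketch: PCA with monitoring stations as the evaluated variables.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
base = rng.normal(0, 1, (36, 1))                    # a shared pollution/seasonal signal
# Six stations; stations 4 and 6 carry mostly independent noise (synthetic).
stations = base @ np.ones((1, 6)) + rng.normal(0, [0.2, 0.3, 0.2, 1.5, 0.3, 1.4], (36, 6))

X = StandardScaler().fit_transform(stations)
pca = PCA().fit(X)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
print("PC1 loadings per station:", np.round(pca.components_[0], 2))
# Stations that load weakly on the dominant components behave differently from the rest;
# stations that are near-duplicates of others are candidates for rationalization.
```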

  20. Quality control of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Verdera, E.S.

    1994-01-01

    The quality control of radiopharmaceuticals is based on physical, physico-chemical and biological controls. Among the different controls the following can be enumerated: visual aspect, particle size and number, activity, purity, pH, isotonicity, sterility, radioimmunoassay, toxicity, stability and clinical assay.

  1. Austrian Daily Climate Data Rescue and Quality Control

    Science.gov (United States)

    Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.

    2010-09-01

    Checked climate datasets are a conditio sine qua non for all projects relevant to environment and climate. In the framework of climate change studies and analyses it is essential to work with quality-controlled and trustworthy data. Furthermore, these datasets are used as input for various simulation models. For investigations of extreme events, like strong precipitation periods, drought periods and similar ones, climate data are needed in high temporal resolution (at least daily resolution). Because of the historical background (during the Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack, so that important information was lost), only a few climate sheets from before 1939, mostly duplicates, are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna made a first attempt to digitize climate data by means of punch cards. With the introduction of a routine climate data quality control in 1984 we can speak of high-class-checked daily data (finally checked data, quality flag 6). Our group has been working on the digitization and quality control of the historical data for the period 1872 to 1983 for 18 years. Since 2007 it has been possible to intensify this work in the framework of an internal project, namely Austrian Climate Data Rescue and Quality Control. The aim of this initiative was, and still is, to supply daily data in an outstandingly good and uniform quality, so the project is a kind of pre-project for all scientific projects which work with daily data. In addition to the routine quality checks (running since 1984) using the commercial Bull software, we are testing our data with additional open source software, namely ProClim.db. By the use of this spatial and statistical test procedure, the elements air temperature and precipitation, for several sites in Carinthia, could

  2. 7 CFR 58.642 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.642 Section 58.642... Procedures § 58.642 Quality control tests. All mix ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow line samples as...

  3. Quality control of nuclear medicine instrumentation

    International Nuclear Information System (INIS)

    Mould, R.F.

    1983-09-01

    The proceedings of a conference held by the Hospital Physicists' Association in London 1983 on the quality control of nuclear medicine instrumentation are presented. Section I deals with the performance of the Anger gamma camera including assessment during manufacture, acceptance testing, routine testing and long-term assessment of results. Section II covers interfaces, computers, the quality control problems of emission tomography and the quality of software. Section III deals with radionuclide measurement and impurity assessment and Section IV the presentation of images and the control of image quality. (U.K.)

  4. Statistical Framework for Recreational Water Quality Criteria and Monitoring

    DEFF Research Database (Denmark)

    Halekoh, Ulrich

    2008-01-01

    Administrators of recreational waters face the basic tasks of surveillance of water quality and decisions on beach closure in case of unacceptable quality. Monitoring and subsequent decisions are based on sampled water probes, and fundamental questions are which type of data to extract from ... recreational governmental authorities controlling water quality. The book opens with a historical account of water quality criteria in the USA between 1922 and 2003. Five chapters are related to sampling strategies and decision rules. Chapter 2 discusses the dependence of decision-making rules on short ... modeling exploiting additional information like meteorological data can support the decision process, as shown in Chapter 10. The question of which information to extract from water sample analyses is closely related to the task of risk assessment for human health. Beach-water quality is often measured ...

  5. INFORMATION SYSTEM QUALITY CONTROL KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    Vladimir Nikolaevich Babeshko

    2017-02-01

    Full Text Available The development of the educational system is associated with the need to control the quality of educational services. Quality control of knowledge is an important part of the scientific process. The penetration of computers into all areas of activity is changing the approaches and technologies that were previously used.

  6. Statistical elements in calculations procedures for air quality control; Elementi di statistica nelle procedure di calcolo per il controllo della qualita' dell'aria

    Energy Technology Data Exchange (ETDEWEB)

    Mura, M.C. [Istituto Superiore di Sanita' , Laboratorio di Igiene Ambientale, Rome (Italy)

    2001-07-01

    The statistical processing of data resulting from the monitoring of chemical atmospheric pollution aimed at air quality control is presented in the form of procedural models, which may offer a practical instrument to operators in the sector. The procedural models are modular and can easily be integrated with other models. They include elementary calculation procedures and mathematical methods for statistical analysis. The calculation elements have been developed by probabilistic induction so as to relate them to the statistical models which are the basis of the methods used for the study and forecasting of atmospheric pollution. This report is part of the updating and training activity that the Istituto Superiore di Sanita' has been carrying on for over twenty years, addressed to operators in the environmental field.

  7. Quality control scheme for thyroid related hormones measured by radioimmunoassay

    International Nuclear Information System (INIS)

    Kamel, R.S.

    1989-09-01

    A regional quality control scheme for thyroid-related hormones measured by radioimmunoassay is being established in the Middle East. The scheme started in January 1985 with eight laboratories, all from Iraq. At present nineteen laboratories from Iraq, Jordan, Kuwait, Saudi Arabia and the United Arab Emirates (Dubai) are participating in the scheme, which is supported by the International Atomic Energy Agency. All participants receive three freeze-dried quality control samples monthly for assay. Results for T3, T4 and TSH received from participants are analysed statistically batch by batch and returned to the participants. Laboratories reporting markedly biased results were contacted to check the assay performance for that particular batch and to define the weak points. Clinical interpretations for certain well-defined samples were reported. A regular case study report has recently been introduced to the scheme and will be distributed regularly as one of the guidelines in establishing a trouble-shooting programme throughout the scheme. The between-laboratory performance showed good results for T4, moderate but acceptable results for T3 and poor results for TSH. The statistical analysis of the results is based on the concept of a "target" value derived from the believed correct value, the median. The overall mean bias values (ignoring signs) for the low, normal and high concentration samples respectively were 18.0 ± 12.5, 11.2 ± 6.4 and 11.2 ± 6.4 for T4; 28.8 ± 23.5, 11.2 ± 8.4 and 13.4 ± 9.0 for T3; and 46.3 ± 50.1, 37.2 ± 28.5 and 19.1 ± 12.1 for TSH. The scheme proved to be effective not only in improving the overall performance but also in developing awareness of the need for internal quality control programmes and giving confidence in the results of the participants. The scheme will continue and will be expanded to involve more laboratories in the region. Refs, fig and tabs
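
    The "target value" evaluation described above amounts to computing each laboratory's percent bias against the all-laboratory median for the same sample; a small sketch with invented T4 results is shown below.

```python
# Sketch: percent bias of each laboratory against the median ("target") value.
import numpy as np

results = {"lab01": 104.0, "lab02": 96.5, "lab03": 121.0, "lab04": 99.0, "lab05": 88.0}  # nmol/L (invented)
target = np.median(list(results.values()))

for lab, value in results.items():
    bias = 100.0 * (value - target) / target
    print(f"{lab}: {value:.1f} nmol/L, bias = {bias:+.1f}%")
```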

  8. An overview of quality control practices in Ontario with particular reference to cholesterol analysis.

    Science.gov (United States)

    Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H

    1999-03-01

    The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario using cholesterol as the QC paradigm. The survey was questionnaire-based seeking information on statistical calculations, software rules, review process and data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1(3s)/R4s while 2(2s)/4(1s)/10x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and documentation on QC graphs.
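
    The multirule QC logic referred to above (Westgard-style rules such as 1-3s, 2-2s and R-4s) can be sketched as checks on successive QC results expressed as z-scores against the laboratory's fixed mean and SD. The implementation below is a simplification (for example, R-4s is normally evaluated within a run across two control levels), and the cholesterol values, mean and SD are invented.

```python
# Sketch of three multirule checks applied to consecutive QC results.
import numpy as np

def westgard_flags(values, mean, sd):
    z = (np.asarray(values) - mean) / sd
    flags = []
    for i in range(len(z)):
        if abs(z[i]) > 3:
            flags.append((i, "1-3s"))                           # random error signal
        if i >= 1 and ((z[i] > 2 and z[i-1] > 2) or (z[i] < -2 and z[i-1] < -2)):
            flags.append((i, "2-2s"))                           # systematic error signal
        if i >= 1 and abs(z[i] - z[i-1]) > 4 and z[i] * z[i-1] < 0:
            flags.append((i, "R-4s"))                           # random error signal
    return flags

qc = [5.2, 5.3, 5.6, 5.9, 5.9, 4.6, 5.1]   # QC cholesterol results, mmol/L (invented)
print(westgard_flags(qc, mean=5.2, sd=0.15))
```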

  9. Errors in patient specimen collection: application of statistical process control.

    Science.gov (United States)

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
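
    The spreadsheet-style charts described above can equally be expressed in code; the sketch below builds a p-chart for the monthly proportion of mislabeled samples, with three-sigma limits that vary with the number of samples collected. The counts are invented.

```python
# Sketch: p-chart for the monthly proportion of mislabeled pretransfusion samples.
import numpy as np

mislabeled = np.array([12, 9, 15, 11, 8, 14, 25, 10])
collected  = np.array([4200, 3900, 4300, 4100, 3800, 4250, 4150, 4000])

p = mislabeled / collected
p_bar = mislabeled.sum() / collected.sum()           # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / collected)     # binomial standard error per month
ucl, lcl = p_bar + 3 * sigma, np.maximum(p_bar - 3 * sigma, 0)

for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    status = "investigate" if (pi > hi or pi < lo) else "stable"
    print(f"month {month}: p = {pi:.4f}, limits ({lo:.4f}, {hi:.4f}) -> {status}")
```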

  10. 2. Product quality control and assurance system

    International Nuclear Information System (INIS)

    1990-01-01

    Product quality control and assurance are dealt with in relation to reliability in nuclear power engineering. The topics treated include product quality control in nuclear power engineering, product quality assurance of nuclear power plant equipment, quality assurance programs, classification of selected nuclear power equipment, and standards relating to quality control and assurance and to nuclear power engineering. Particular attention is paid to Czechoslovak and CMEA standards. (P.A.). 2 figs., 1 tab., 12 refs

  11. Determination and evaluation of air quality control. Manual of ambient air quality control in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Lahmann, E.

    1997-07-01

    Measurements of air pollution emissions and of ambient air quality are essential instruments for air quality control. By undertaking such measurements, pollutants are registered both at their place of origin and at the place where they may have an effect on people or the environment. Both types of measurement complement each other and are essential for the implementation of air quality legislation, particularly for compliance with emission and ambient air quality limit values. Presented here are accounts of measurement principles; the manual also contains, as an appendix, a list of suitability-tested measuring devices based on information provided by the manufacturers. In addition, the guide to ambient air quality control contains further information on discontinuous measurement methods, on measurement planning and on the assessment of ambient air quality data. (orig./SR)

  12. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  13. 30 CFR 74.6 - Quality control.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control. 74.6 Section 74.6 Mineral... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and... DUST SAMPLING DEVICES Approval Requirements for Coal Mine Dust Personal Sampler Unit § 74.6 Quality...

  14. Water quality, Multivariate statistical techniques, submarine out fall, spatial variation, temporal variation

    International Nuclear Information System (INIS)

    Garcia, Francisco; Palacio, Carlos; Garcia, Uriel

    2012-01-01

    Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality in the Santa Marta coastal area, where a submarine outfall discharging 1 m3/s of domestic wastewater is located. Two-way analysis of variance (ANOVA), cluster analysis, principal component analysis and kriging interpolation were considered for this report. The temporal variation showed two heterogeneous periods: from December to April, and July, the concentrations of the water quality parameters are higher, while during the rest of the year (May, June, August-November) they are significantly lower. The spatial variation revealed two areas where the water quality differs; this difference is related to the proximity to the submarine outfall discharge.
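
    The two-way ANOVA step mentioned above, testing whether a water quality parameter differs by season and by sampling area, can be sketched with statsmodels as shown below; the parameter, factor levels and values are invented.

```python
# Sketch: two-way ANOVA of a water quality parameter by season and sampling area.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "nitrate": [2.1, 2.4, 3.8, 4.1, 1.9, 2.2, 3.5, 3.9],          # mg/L (invented)
    "season":  ["dry", "dry", "wet", "wet", "dry", "dry", "wet", "wet"],
    "area":    ["near", "far", "near", "far", "near", "far", "near", "far"],
})
model = ols("nitrate ~ C(season) + C(area)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))                             # F-tests for each factor
```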

  15. Results of a multicentre randomised controlled trial of statistical process control charts and structured diagnostic tools to reduce ward-acquired meticillin-resistant Staphylococcus aureus: the CHART Project.

    Science.gov (United States)

    Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R

    2008-10-01

    Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers of ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (Pcontrol wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.

  16. FEATURES OF THE APPLICATION OF STATISTICAL INDICATORS OF SCHEDULED FLIGHTS OF AIRCRAFT

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The possibilities of increasing the effectiveness of safety management of regular aircraft operations under normal operating conditions, on the basis of a systematic approach, are considered. These new opportunities within the airline are based on the integration of the Flight Safety Management System with the quality management system. So far, however, these possibilities have been implemented only to a limited degree because of the limited application of statistical methods. A necessary condition for the implementation of the proposed approach is the use of statistical flight data from the routine quality control of flights. The properties and peculiarities of the application of statistical indicators of flight parameters during the monitoring of flight data are analyzed. It is shown that the main statistical indicators of the controlled process are averages and variations. The features of the application of theoretical models of mathematical statistics in the analysis of flight information are indicated. It is noted that in practice the theoretical models often do not fit the framework of their application because of violations of the initial assumptions. Recommendations are given for the integrated use of statistical indicators in the current quality control of flights. Ultimately, the article concludes that the proposed approach makes it possible, on the basis of knowledge about the dynamics of statistical indicators of the controlled flight process, to identify hazards and to develop safety indicators from new information based on flight data of aircraft operation.

  17. [Quality control in herbal supplements].

    Science.gov (United States)

    Oelker, Luisa

    2005-01-01

    Quality and safety of food and herbal supplements are the result of a number of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  18. Comparing the Effects of Reflexology and Footbath on Sleep Quality in the Elderly: A Controlled Clinical Trial.

    Science.gov (United States)

    Valizadeh, Leila; Seyyedrasooli, Alehe; Zamanazadeh, Vahid; Nasiri, Khadijeh

    2015-11-01

    Sleep disorders are common mental disorders reported among the elderly in all countries, and nonpharmacological interventions could help improve their sleep quality. The aim of this study was to compare the effects of two interventions, foot reflexology and foot bath, on sleep quality in elderly people. This three-group randomized clinical trial (two experimental groups and a control group) was conducted on 69 elderly men. The two experimental groups had reflexology (n = 23) and foot bath (n = 23) interventions for 6 weeks. The reflexology intervention was done in the mornings, once a week, for ten minutes on each foot. The participants in the foot bath group were asked to soak their feet in 41°C to 42°C water one hour before sleeping. The Pittsburgh Sleep Quality Index (PSQI) was completed before and after the intervention through an interview process. The results showed that the changes in PSQI scores after the intervention compared to before it were statistically significant in the reflexology and foot bath groups (P = 0.01, P = 0.001), whereas the control group did not show a statistically significant difference (P = 0.14). In addition, the total score changes among the three groups were statistically significant (P = 0.01). Comparing the changes in sleep quality scores between the reflexology and foot bath groups showed no significant difference in any of the components or in the total score (P = 0.09). The two interventions had the same impact on quality of sleep. It is suggested that training in nonpharmacological methods to improve sleep quality, such as reflexology and foot bath, be included in elderly health programs. In addition, it is recommended that the impact of these interventions on sleep quality be explored in future research using polysomnographic recordings.

  19. Distributed sensor architecture for intelligent control that supports quality of control and quality of service.

    Science.gov (United States)

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-02-25

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support for measuring QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems.

  20. Distributed Sensor Architecture for Intelligent Control that Supports Quality of Control and Quality of Service

    Directory of Open Access Journals (Sweden)

    Jose-Luis Poza-Lujan

    2015-02-01

    Full Text Available This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support for measuring QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems.

  1. Quality control education in the community college

    Science.gov (United States)

    Greene, J. Griffen; Wilson, Steve

    1966-01-01

    This paper describes the Quality Control Program at Daytona Beach Junior College, including course descriptions. The program in quality control required communication between the college and the American Society for Quality Control (ASQC). The college has machinery established for certification of the learning process, and the society has the source of teachers who are competent in the technical field and who are the employers of the educational products. The associate degree for quality control does not have a fixed program, which can serve all needs, any more than all engineering degrees have identical programs. The main ideas which would be common to all quality control programs are the concept of economic control of a repetitive process and the concept of developing individual potentialities into individuals who are needed and productive.

  2. TRAINING SYSTEM OF FUTURE SPECIALISTS: QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    Vladimir A. Romanov

    2015-01-01

    Full Text Available The aim of the investigation is the development of an innovative strategy for quality control of the training of engineers and skilled workers (hereinafter, future specialists) in educational professional organizations on the principles of social partnership. Methods. Theoretical: theoretical and methodological analysis, polytheoretic synthesis, modeling. Empirical: research and generalization of the experience of the system, process and competence-based approaches, experiment, observation, surveys, expert evaluation, and SWOT analysis as a method of strategic planning to identify the internal and external (socio-cultural) factors of the organization's surroundings. Results. A strategy for developing the process of quality control of training in educational professional organizations, and a predictive model of the system of quality control of training for future engineers and workers, have been created on the basis of analysis and synthesis of a quantitative specification of quality and of the experience and successes obtained in the control of training of future specialists in educational professional organizations under recent economic and educational conditions. Scientific novelty. A predictive model of quality control of the training of future specialists has been built to meet modern standards and the principles of social partnership, together with a control algorithm for the learning process developed in accordance with the (international) ISO quality standards for implementing the process approach in quality control systems (a matrix of responsibility, competence and remit of those responsible for the education process in the educational organization, the "problem" terms and diagnostic tools for assessing the quality of professional training of future specialists). The perspective directions of innovation in the control of the quality of training of future professionals have been determined, together with the parameters of a comprehensive analysis of the state of the system to ensure the

  3. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    Science.gov (United States)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the pollution sources in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show signs of poorer quality for drinking purposes compared to the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% of the seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% of the spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. The varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in water.
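
    The hierarchical clustering step described above can be sketched as follows: standardize the stations-by-parameters matrix, apply Ward linkage and cut the tree into two groups. The parameter set and the small data matrix below are invented for illustration.

```python
# Sketch: hierarchical cluster analysis of sampling stations by water-quality similarity.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# rows = stations, columns = parameters (e.g. pH, EC, Cl, NO3, Mg) -- invented values
data = np.array([
    [7.1,  650,  60, 18, 22],
    [7.0,  700,  65, 20, 25],
    [6.5, 1800, 240, 95, 60],
    [6.6, 1700, 230, 90, 58],
    [7.2,  620,  55, 15, 20],
])
Z = linkage(zscore(data, axis=0), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster label per station:", labels)   # e.g. low- vs high-pollution groups
```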

  4. 14 CFR 145.211 - Quality control system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Quality control system. 145.211 Section 145...) SCHOOLS AND OTHER CERTIFICATED AGENCIES REPAIR STATIONS Operating Rules § 145.211 Quality control system. (a) A certificated repair station must establish and maintain a quality control system acceptable to...

  5. 18 CFR 12.40 - Quality control programs.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Quality control... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a... meeting any requirements or standards set by the Regional Engineer. If a quality control program is...

  6. Developing methods of controlling quality costs

    Directory of Open Access Journals (Sweden)

    Gorbunova A. V.

    2017-01-01

    Full Text Available The article examines the management of quality costs, the problems of applying economic methods of quality control, and the implementation of progressive methods of quality-cost management in enterprises, with a view to improving the efficiency of their evaluation and analysis. To increase the effectiveness of the cost-management mechanism, the authors introduce controlling as a tool for deviation analysis from the standpoint of the process approach. A list of processes and corresponding evaluation criteria in the quality management system of an enterprise is introduced. The authors also propose a method of controlling quality costs for practical application, which makes it possible to distinguish useful from unnecessary costs at an existing operating plant. Implementing the proposed recommendations in an enterprise's cost-management system will improve the productivity of its processes and reduce wasted expenditure on quality, based on determining the useful and useless quality costs according to the criteria by which processes function within the quality management system.

  7. Quality of life after iatrogenic bile duct injury: a case control study.

    LENUS (Irish Health Repository)

    Hogan, Aisling M

    2012-02-01

    OBJECTIVE: To compare the quality of life (QOL) of patients following iatrogenic bile duct injuries (BDI) with matched controls. SUMMARY BACKGROUND DATA: BDI complicate approximately 0.3% of all cholecystectomy procedures. The literature regarding the impact on quality of life is conflicting, as assessment using clinical determinants alone is insufficient. METHODS: The Medical Outcomes Study Short Form 36 (SF-36), a sensitive tool for quantifying quality-of-life outcomes, was used. The study group of iatrogenic BDI was compared with an age- and sex-matched group who underwent uncomplicated cholecystectomy. A telephone questionnaire using the SF-36 quality-of-life tool was administered to both groups at a median postoperative time of 12 years 8 months (range, 2 months to 20 years). RESULTS: Seventy-eight patients were referred with BDI but, due to mortality (n = 10) and unavailability (n = 6), 62 participated. The age- and sex-matched control cohort had undergone uncomplicated cholecystectomy (n = 62). Comparison between groups revealed that 7 of the 8 variables examined were statistically similar to those of the control group (physical functioning, role physical, bodily pain, general health perceptions, vitality, social functioning, and mental health index). Mean role emotional scores were slightly worse in the BDI group (46 vs. 50) but the significance was borderline (P = 0.045). Subgroup analysis by method of intervention for BDI did not demonstrate significant differences. CONCLUSION: Quality of life of surviving patients following BDI compares favorably to that after uncomplicated laparoscopic cholecystectomy.

  8. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used to evaluate analytical chemistry measurement quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort
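
    As a rough illustration of the kind of bias and precision estimates such a system computes from quality control standards, here is a minimal sketch; the certified value and measurements are invented, and this is not the Barnwell system's actual code.

```python
# Sketch: bias and precision estimates from quality-control standard measurements,
# as a computerized measurement-control system might compute them. The certified
# value and the measurement list are illustrative only.
import numpy as np

certified_value = 10.00                 # known concentration of the QC standard (assumed)
measured = np.array([10.12, 9.95, 10.08, 10.21, 9.90, 10.05, 10.11, 9.98])

bias = measured.mean() - certified_value        # additive bias estimate
relative_bias = bias / certified_value          # for a multiplicative correction factor
precision_sd = measured.std(ddof=1)             # standard deviation estimate

# A routine result x could then be bias-corrected as x - bias (or x / (1 + relative_bias))
print(f"bias = {bias:+.3f}, relative bias = {relative_bias:+.2%}, s = {precision_sd:.3f}")
```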

  9. THE SOCIETY’S PERCEPTION OF THE LIFE QUALITY AND POPULATION CONTROL OF STRAY DOGS

    Directory of Open Access Journals (Sweden)

    Flavio Fernando Batista Moutinho

    2015-10-01

    Full Text Available In most Brazilian municipalities there is an overpopulation of stray dogs, which causes problems for urban order, the environment and public health, in addition to mistreatment of these dogs. In this context there is a need to develop actions targeting the population control of these animals. This essay aims to understand the perception of social actors, such as managers of entities responsible for control actions, managers of NGOs working in animal protection, and the general population, with respect to the life quality and population control of stray dogs. Questionnaires were applied to samples of individuals from these three groups and the data were analyzed with descriptive statistics and frequency comparisons. The results allowed us to conclude that society's perception of the population control and life quality of these animals shows important differences among the three evaluated groups; however, it also shows significant similarities, especially with respect to the perception of responsibility for developing population control actions, the acceptance of using public funds intended for public health in control actions, the classification of this population density as large, and the poor life quality of these animals. Keywords: population control, social perception, stray dog.

  10. Application of statistical process control to qualitative molecular diagnostic assays.

    Directory of Open Access Journals (Sweden)

    Cathal P O'brien

    2014-11-01

    Full Text Available Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to qualitative assays is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and the detection of small deviations from an expected value require larger sample numbers, with a resultant protracted time to detection. Modelled laboratory data were also used to illustrate how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
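
    A minimal sketch of the idea of a frequency estimate coupled with a confidence interval follows; the expected mutation frequency, the counts and the choice of a Wilson score interval are assumptions for illustration, not the authors' implementation.

```python
# Sketch: flag a run of qualitative assay results whose observed mutation frequency
# is inconsistent with the expected frequency. The expected rate and counts are
# illustrative, and the Wilson interval is one reasonable choice of interval.
from math import sqrt

def wilson_interval(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

expected = 0.40          # expected mutation-positive frequency (assumed)
positives, total = 31, 60
lo, hi = wilson_interval(positives, total)
in_control = lo <= expected <= hi
print(f"observed {positives/total:.2f}, CI ({lo:.2f}, {hi:.2f}), in control: {in_control}")
```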

  11. Quality assurance, quality control and quality audit in diagnostic radiology

    International Nuclear Information System (INIS)

    Vassileva, J.

    2009-01-01

    Full text: The lecture aims to present a contemporary view of quality assurance in X-ray diagnostics and its practical realization in Bulgaria. In the lecture the concepts of quality assurance, quality control and clinical audit will be defined and their scope will be considered. Answers to the following questions will be given: why is it necessary to determine the patient dose in X-ray studies; what is the reference dose level and how is it used; which dosimetric quantities characterize the patient's exposure in radiography, mammography and CT examinations and how are they measured; who conducts the measurements and how are the records kept; and what are the variations of doses in identical examinations and what determines them? The findings from a national survey of doses in diagnostic radiology, conducted in 2008-2009, and the newly developed national reference levels will be presented. The main findings of the first tests of radiological equipment and the future role of quality control, as well as the concept of conducting clinical audit and its role in quality assurance, are also presented. Quality assurance of the diagnostic process with minimal exposure of patients is a strategic goal whose realization requires understanding, organization and practical action, both nationally and in every hospital. To achieve this, education and training of physicians, radiological technicians and medical physicists play an increasingly important role

  12. Methodological and Reporting Quality of Comparative Studies Evaluating Health-Related Quality of Life of Colorectal Cancer Patients and Controls: A Systematic Review.

    Science.gov (United States)

    Wong, Carlos K H; Guo, Vivian Y W; Chen, Jing; Lam, Cindy L K

    2016-11-01

    Health-related quality of life is an important outcome measure in patients with colorectal cancer. Comparison with normative data has been increasingly undertaken to assess the additional impact of colorectal cancer on health-related quality of life. This review aimed to critically appraise the methodological details and reporting characteristics of comparative studies evaluating differences in health-related quality of life between patients and controls. A systematic search of English-language literature published between January 1985 and May 2014 was conducted through a database search of PubMed, Web of Science, Embase, and Medline. Comparative studies reporting health-related quality-of-life outcomes among patients who have colorectal cancer and controls were selected. Methodological and reporting quality per comparison study was evaluated based on an 11-item methodological checklist proposed by Efficace in 2003 and a set of criteria predetermined by reviewers. Thirty-one comparative studies involving >10,000 patients and >10,000 controls were included. Twenty-three studies (74.2%) originated from European countries, with the largest number from the Netherlands (n = 6). Twenty-eight studies (90.3%) compared the health-related quality of life of patients with normative data published elsewhere, whereas the remaining studies recruited a group of patients who had colorectal cancer and a group of control patients within the same studies. The European Organisation for Research and Treatment of Cancer Quality-of-Life Questionnaire Core 30 was the most extensively used instrument (n = 16; 51.6%). Eight studies (25.8%) were classified as "probably robust" for clinical decision making according to the Efficace standard methodological checklist. Our further quality assessment revealed a lack of reported score differences (61.3%), contemporary comparisons (36.7%), testing of statistical significance (38.7%), and matching of the control group (58.1%), possibly leading to

  13. Employee quality, monitoring environment and internal control

    Directory of Open Access Journals (Sweden)

    Chunli Liu

    2017-03-01

    Full Text Available We investigate the effect of internal control employees (ICEs on internal control quality. Using special survey data from Chinese listed firms, we find that ICE quality has a significant positive influence on internal control quality. We examine the effect of monitoring on this result and find that the effect is more pronounced for firms with strict monitoring environments, especially when the firms implement the Chinese internal control regulation system (CSOX, have higher institutional ownership or attach greater importance to internal control. Our findings suggest that ICEs play an important role in the design and implementation of internal control systems. Our study should be of interest to both top managers who wish to improve corporate internal control quality and regulators who wish to understand the mechanisms of internal control monitoring.

  14. Problems of quality assurance and quality control in diagnostic radiology

    International Nuclear Information System (INIS)

    Angerstein, W.

    1986-01-01

    Topical problems of quality assurance and quality control in diagnostic radiology are discussed and possible solutions are shown. Complex units are differentiated with reference to physicians, technicians, organization of labour, methods of examination and indication. Quality control of radiologic imaging systems should involve three stages: (1) simple tests carried out by radiologic technicians, (2) measurements by service technicians, (3) testing of products by the manufacturer and independent governmental or health service test agencies. (author)

  15. Quality Control in construction.

    Science.gov (United States)

    1984-01-01

    behavioral scientists. In 1962, Dr. Kaoru Ishikawa gave shape to the form of training which featured intradepartmental groups of ten or so workers seated... and Japanese circles bears closer scrutiny. 4.3.1 Japanese Ingredients of Quality: The founder of quality circles, Dr. Kaoru Ishikawa, gives six... around a table; hence the name Quality Control Circle. Dr. Ishikawa was an engineering professor at Tokyo University, and the circles were

  16. Quality management and Juran's legacy

    NARCIS (Netherlands)

    Bisgaard, S.

    2008-01-01

    Quality management provides the framework for the industrial application of statistical quality control, design of experiments, quality improvement, and reliability methods. It is therefore helpful for quality engineers and statisticians to be familiar with basic quality management principles. In

  17. Establishment for quality control of experimental animal

    International Nuclear Information System (INIS)

    Kim, Tae Hwan; Kim, Soo Kwan; Kim, Tae Kyoung

    1999-06-01

    Until now we have imported experimental animals from foreign laboratory animal suppliers; money can be saved by establishing quality control of animals in a barrier system. In order to improve the quality of animal experiments and the efficiency of biomedical studies, it is indispensable to control the many factors that affect an experiment. It is therefore essential to organize a system of laboratory animal care to enhance the reliability and reproducibility of experimental results. The purpose of the present investigation was to establish a quality control system for experimental animals so that we can provide good-quality animals suited to the experimental conditions of each investigator, although an exact quality control system for easily assessing bacterial and viral infection remains ill-defined. Accordingly, we established a useful quality control system of microbiological and environmental monitoring to protect experimental animals from harmful bacteria and viruses.

  18. Establishment for quality control of experimental animal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Hwan; Kim, Soo Kwan; Kim, Tae Kyoung

    1999-06-01

    Until now we have imported experimental animals from foreign laboratory animal suppliers; money can be saved by establishing quality control of animals in a barrier system. In order to improve the quality of animal experiments and the efficiency of biomedical studies, it is indispensable to control the many factors that affect an experiment. It is therefore essential to organize a system of laboratory animal care to enhance the reliability and reproducibility of experimental results. The purpose of the present investigation was to establish a quality control system for experimental animals so that we can provide good-quality animals suited to the experimental conditions of each investigator, although an exact quality control system for easily assessing bacterial and viral infection remains ill-defined. Accordingly, we established a useful quality control system of microbiological and environmental monitoring to protect experimental animals from harmful bacteria and viruses.

  19. Design of a Quality Control Program for the Measurement of Gross Alpha and Gross Beta Activities (LMPR-CIEMAT); Diseno del Control de Calidad de las Medidas de Actividad Alfa-Beta Total (LMPR-CIEMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, A.; Yague, L.; Gasco, C.; Navarro, N.; Higueras, E.; Noguerales, C.

    2010-10-21

    In accordance with international standards, general requirements for testing laboratories have to include a quality system for planning, implementing, and assessing the work performed by the organization and for carrying out required quality assurance and quality control. The purpose of internal laboratory quality control is to monitor performance, identify problems, and initiate corrective actions. This report describes the internal quality control to monitor the gross alpha and beta activities determination. Identification of specific performance indicators, the principles that govern their use and statistical means of evaluation are explained. Finally, calculation of alpha and beta specific activities, uncertainties and detection limits are performed. (Author) 10 refs.
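
    One standard way to derive the detection limits mentioned above is Currie's formulation for paired blank counting; the sketch below is illustrative only, with assumed blank counts, counting efficiency, counting time and sample mass.

```python
# Sketch: Currie-style critical level and detection limit for a gross-count
# measurement (paired-blank case), one common route to the detection limits
# mentioned above. All input values are assumed for illustration.
from math import sqrt

blank_counts = 40          # background counts observed in the counting time (assumed)
efficiency = 0.35          # overall counting efficiency (assumed)
count_time = 3600.0        # counting time in seconds (assumed)
sample_mass = 0.5          # sample mass in kg (assumed)

critical_level = 2.33 * sqrt(blank_counts)            # decision threshold, counts
detection_limit = 2.71 + 4.65 * sqrt(blank_counts)    # a priori detection limit, counts

# Convert the detection limit to a minimum detectable specific activity
mda = detection_limit / (efficiency * count_time * sample_mass)   # Bq/kg
print(f"Lc = {critical_level:.1f} counts, Ld = {detection_limit:.1f} counts, MDA = {mda:.4f} Bq/kg")
```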

  20. Quality control for dose calibrators

    International Nuclear Information System (INIS)

    Mendes, L.C.G.

    1984-01-01

    Nuclear medicine laboratories are required to assay samples of radioactivity to be administered to patients. Almost universally, these assays are accomplished by use of a well ionization chamber isotope calibrator. The Instituto de Radioprotecao e Dosimetria (Institute for Radiological Protection and Dosimetry) of the Comissao Nacional de Energia Nuclear (National Commission for Nuclear Energy) is carrying out a National Quality Control Programme in Nuclear Medicine, supported by the International Atomic Energy Agency. The assessment of the current needs and practices of quality control in the entire country of Brazil includes Dose Calibrators and Scintillation Cameras, but this manual is restricted to the former. Quality Control Procedures for these Instruments are described in this document together with specific recommendations and assessment of its accuracy. (author)

  1. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    Science.gov (United States)

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures. Therefore, the use of this methodology in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method for reducing complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality.

  2. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  3. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.); Optimisation automatique des incidences des faisceaux par l'algorithme du simplexe et optimisation des controles qualite par la Maitrise Statistique des Processus (MSP) en Radiotherapie Conformationnelle par Modulation d'Intensite (RCMI)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, K

    2008-11-15

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered a technique of choice to increase local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of treatment preparation, two axes of work were defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm into the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with cancer of the oropharynx showed that the quality of the plan is also increased compared with manual beam selection. Indeed, for equivalent or even better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results obtained using an ionisation chamber at the Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment security using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)
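
    As an illustration of how such control charts and capability statements can be produced from absolute-dose QC deviations, here is a minimal sketch; the deviation values, the assumed 5% tolerance and the individuals/moving-range chart are illustrative choices, not the C.A.V. data or analysis.

```python
# Sketch: an individuals (X) control chart and a capability index for absolute-dose
# QC deviations (measured minus planned dose, in %), in the spirit of the SPC
# analysis described above. The data and the 5% tolerance are illustrative.
import numpy as np

dev = np.array([0.8, -0.3, 1.1, 0.2, -0.6, 0.9, 0.4, -0.2, 1.3, 0.1,
                -0.5, 0.7, 0.3, -0.1, 0.6])      # % deviations (synthetic)

mean = dev.mean()
moving_range = np.abs(np.diff(dev))
sigma_est = moving_range.mean() / 1.128          # d2 = 1.128 for a moving range of 2
ucl, lcl = mean + 3 * sigma_est, mean - 3 * sigma_est

tol_lo, tol_hi = -5.0, 5.0                       # assumed clinical tolerance (%)
cp = (tol_hi - tol_lo) / (6 * sigma_est)         # process capability index

out_of_control = np.where((dev > ucl) | (dev < lcl))[0]
print(f"UCL={ucl:.2f}%, LCL={lcl:.2f}%, Cp={cp:.2f}, out-of-control points: {out_of_control}")
```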

  4. Statistical issues in reporting quality data: small samples and casemix variation.

    Science.gov (United States)

    Zaslavsky, A M

    2001-12-01

    To present two key statistical issues that arise in analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (inter-unit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With broader understanding of casemix adjustment and methods for analyzing small samples, quality data can be analysed and reported more accurately.
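
    The shrinkage idea mentioned above can be illustrated with a small sketch: each unit's observed mean is pulled toward the grand mean in proportion to its estimated unreliability. All variance components and scores below are invented for illustration and are not taken from the article.

```python
# Sketch: shrinkage ("empirical Bayes") estimation of unit-level quality scores,
# pulling small-sample units toward the overall mean in proportion to their
# unreliability. The variance components here are assumed, not estimated from data.
import numpy as np

unit_means = np.array([78.0, 85.0, 62.0, 90.0])   # observed mean scores per unit
n = np.array([12, 150, 8, 400])                   # sample sizes per unit

sigma2_within = 400.0      # within-unit variance of individual responses (assumed)
tau2_between = 25.0        # true between-unit variance (assumed)

# Reliability of each unit mean, and the corresponding shrunken estimate
reliability = tau2_between / (tau2_between + sigma2_within / n)
grand_mean = np.average(unit_means, weights=n)
shrunk = grand_mean + reliability * (unit_means - grand_mean)

for u, r, s in zip(unit_means, reliability, shrunk):
    print(f"observed {u:5.1f}  reliability {r:.2f}  shrunken {s:5.1f}")
```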

  5. 42 CFR 84.256 - Quality control requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  6. Report on the analysis of the quality assurance and quality control data for the petroleum refining sector

    International Nuclear Information System (INIS)

    Thorton, N.; Michajluk, S.; Powell, T.; Lee, G.

    1992-07-01

    The Ontario Municipal-Industrial Strategy for Abatement (MISA) program has the ultimate goal of virtual elimination of persistent toxic contaminants from all discharges to provincial waterways. MISA effluent monitoring regulations, first promulgated for the petroleum refining sector, require direct dischargers to monitor their effluents for a specified number of contaminants and a specified frequency over a one-year period. The refineries were also required to carry out a quality control program on all process effluent streams and for specified analytical test groups. Two types of quality assurance/quality control (QA/QC) data were required: field QA/QC, which would indicate problems with field contamination or sampling, and laboratory QA/QC, which would indicate problems with the laboratory. The objectives of QA/QC analysis are to identify the significance of biases, chronic contamination, data variability, and false results, to assess data validity, and to allow data comparability among companies and laboratories. Of the 149 parameters monitored in the petroleum refining sector, 34 qualified as candidates for setting effluent limits. The QA/QC evaluation of monitoring data for the 34 parameters confirmed the presence of 18 parameters at such levels that they could be used to set statistically valid quantitative limits. 50 tabs

  7. No-reference image quality assessment based on statistics of convolution feature maps

    Science.gov (United States)

    Lv, Xiaoxin; Qin, Min; Chen, Xiaohui; Wei, Guo

    2018-04-01

    We propose a Convolutional Feature Maps (CFM) driven approach to accurately predict image quality. Our motivation is based on the finding that the Natural Scene Statistics (NSS) features computed on convolutional feature maps are significantly sensitive to the distortion degree of an image. In our method, a Convolutional Neural Network (CNN) is trained to obtain kernels for generating the CFM. We design a forward NSS layer which operates on the CFM to better extract NSS features. The quality-aware features derived from the output of the NSS layer are effective in describing the distortion type and degree an image has suffered. Finally, a Support Vector Regression (SVR) is employed in our No-Reference Image Quality Assessment (NR-IQA) model to predict a subjective quality score of a distorted image. Experiments conducted on two public databases demonstrate that the performance of the proposed method is competitive with state-of-the-art NR-IQA methods.

  8. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    Science.gov (United States)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) in order to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This makes it possible to benefit from high-end quality control based on the national and world-wide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of subprocesses to check the consistency of the whole system and of the processing chain from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed quality control consists of a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes). It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking and station magnitudes
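
    A toy version of one of the high-level checks, the STA/LTA computation, is sketched below in plain NumPy; the sampling rate, window lengths, trigger threshold and synthetic trace are assumptions, and the real pipeline presumably uses dedicated seismological tooling rather than this sketch.

```python
# Sketch: a short-term-average / long-term-average (STA/LTA) ratio on a synthetic
# trace, one of the high-level quality-control computations mentioned above.
# Window lengths and the threshold are illustrative.
import numpy as np

fs = 100.0                                  # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
trace = rng.normal(scale=1.0, size=60 * int(fs))
trace[3000:3200] += 8.0 * np.hanning(200)   # add a synthetic "event"

def sta_lta(x, nsta, nlta):
    """Classic STA/LTA ratio computed on the squared signal (energy)."""
    energy = x ** 2
    csum = np.cumsum(energy)
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    m = min(len(sta), len(lta))             # align both series by window end
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

ratio = sta_lta(trace, nsta=int(0.5 * fs), nlta=int(10 * fs))
print("max STA/LTA ratio:", round(float(ratio.max()), 2))
print("samples above trigger threshold 4.0:", int((ratio > 4.0).sum()))
```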

  9. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    Science.gov (United States)

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

    To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpage to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of the RCTs. The CONSORT 2010 checklist was adopted as the reporting-quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were also evaluated for reporting quality. For the methodological evaluation, 62 trials (33.70%) reported the random sequence generation; 9 trials (4.89%) reported allocation concealment; 25 trials (13.59%) adopted a method of blinding; 30 trials (16.30%) reported the number of patients withdrawing, dropping out or lost to follow-up; 2 trials (1.09%) reported trial registration and none of the trials reported a trial protocol; only 8 trials (4.35%) reported the sample size estimation in detail. For the reporting-quality appraisal, 3 of the 25 reporting items were rated high quality, including: abstract, participant eligibility criteria, and statistical methods; 4 reporting items were of medium quality, including purpose, intervention, random sequence method, and data collection sites and locations; 9 reporting items were of low quality, including title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding; the rest of the items were of extremely low quality (compliance rate of the reporting item <10%). On the whole, the methodological and reporting quality of RCTs published in the journal are generally low. Further improvement in both the methodological and reporting quality of RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design should be strictly followed to conduct high-quality trials

  10. Statistical quality management

    NARCIS (Netherlands)

    Laan, van der P.

    1992-01-01

    Some general remarks are made about statistical quality management. Total or Integral Quality Management (Total Quality Management) is briefly discussed. Lecture given on 21 October 1992 for members of the student association GEWIS for Mathematics and Computer Science.

  11. Employee quality, monitoring environment and internal control

    OpenAIRE

    Chunli Liu; Bin Lin; Wei Shu

    2017-01-01

    We investigate the effect of internal control employees (ICEs) on internal control quality. Using special survey data from Chinese listed firms, we find that ICE quality has a significant positive influence on internal control quality. We examine the effect of monitoring on this result and find that the effect is more pronounced for firms with strict monitoring environments, especially when the firms implement the Chinese internal control regulation system (CSOX), have higher institutional ow...

  12. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    Science.gov (United States)

    Vacek, Austin

    2016-01-01

    Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By combining the newly quality-controlled profiles with older profiles from 1997-2009, a robust database of upper-level wind characteristics will be constructed. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data are retained or removed from the profiles.
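
    The planned statistics (maximum, minimum and 95th percentile of the wind components) amount to a simple computation of the kind sketched below; the synthetic wind components stand in for the actual DRWP profiles.

```python
# Sketch: summary statistics (minimum, maximum, 95th percentile) of upper-level
# wind components, of the kind described above. The values are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
u = rng.normal(loc=15.0, scale=20.0, size=5000)   # zonal wind, m/s (synthetic)
v = rng.normal(loc=2.0, scale=12.0, size=5000)    # meridional wind, m/s (synthetic)

for name, comp in (("u", u), ("v", v)):
    print(f"{name}: min={comp.min():6.1f}  max={comp.max():6.1f}  "
          f"p95={np.percentile(comp, 95):6.1f} m/s")
```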

  13. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  14. Pengendalian Kualitas Kertas Dengan Menggunakan Statistical Process Control di Paper Machine 3

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2017-01-01

    Full Text Available The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method. Statistical process control (SPC) is a technique for solving problems and is used to monitor, control, analyze, manage and improve products and processes using statistical methods. Based on Pareto diagrams, the wavy defect is found to be the most frequent defect, accounting for 81.7%. The human factor, meanwhile, is found to be the main cause of defects, primarily due to a lack of understanding of the machinery and a lack of training, both leading to errors in data input.
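
    A Pareto analysis of defect counts of the kind described can be reproduced in a few lines; the defect categories and counts below are illustrative, chosen only so that the leading category carries roughly the reported 81.7% share.

```python
# Sketch: Pareto analysis of defect counts, as used above to identify the dominant
# defect type. The categories and counts are illustrative, not the study's data.
defects = {"wavy": 981, "hole": 94, "dirt": 61, "wrinkle": 38, "other": 27}

total = sum(defects.values())
cumulative = 0.0
print(f"{'defect':<10}{'count':>7}{'share':>9}{'cum.':>9}")
for name, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    share = count / total
    cumulative += share
    print(f"{name:<10}{count:>7}{share:>9.1%}{cumulative:>9.1%}")
```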

  15. Combining Statistical Methodologies in Water Quality Monitoring in a Hydrological Basin - Space and Time Approaches

    OpenAIRE

    Costa, Marco; A. Manuela Gonçalves

    2012-01-01

    This work discusses some statistical approaches that combine multivariate statistical techniques and time series analysis in order to describe and model spatial patterns and temporal evolution from hydrological series of water quality variables recorded in time and space. These approaches are illustrated with a data set collected in the River Ave hydrological basin located in the Northwest region of Portugal.

  16. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Directory of Open Access Journals (Sweden)

    Fabiano Rodrigues Soriano

    Full Text Available Abstract Statistical Process Control (SPC) is a set of statistical techniques focused on process control: monitoring and analyzing the causes of variation in quality characteristics and/or in the parameters used for process control and improvement. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as a lack of top management commitment and little understanding of its potential benefits. Other aspects concern technical factors, such as a lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top management commitment (support, SPC training and application), as well as the relationships between these factors and the benefits associated with the implementation of the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish causal relations. A cross-sectional survey was used as the research method to collect information from a sample of Brazilian auto-parts companies, which were selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey. However, just 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs. In turn, this training affects the way companies apply the techniques, which in turn is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected to each other and that they are represented by the relationship between top management support and training. The technical aspects observed through SPC

  17. Evaluation of quality assurance/quality control data collected by the U.S. Geological Survey for water-quality activities at the Idaho National Engineering and Environmental Laboratory, Idaho, 1994 through 1995

    International Nuclear Information System (INIS)

    Williams, L.M.

    1997-03-01

    More than 4,000 water samples were collected by the US Geological Survey (USGS) from 179 monitoring sites for the water-quality monitoring program at the Idaho National Engineering Laboratory from 1994 through 1995. Approximately 500 of the water samples were replicate or blank samples collected for the quality assurance/quality control program. Analyses were performed to determine the concentrations of major ions, nutrients, trace elements, gross radioactivity and radionuclides, total organic carbon, and volatile organic compounds in the samples. To evaluate the precision of field and laboratory methods, analytical results of the replicate pairs of samples were compared statistically for equivalence on the basis of the precision associated with each result. In all, the statistical comparison of the data indicated that 95% of the replicate pairs were equivalent. Within the major ion analyses, 97% were equivalent; nutrients, 88%; trace elements, 95%; gross radioactivity and radionuclides, 93%; and organic constituents, 98%. Ninety percent or more of the analytical results for each constituent were equivalent, except for nitrite, orthophosphate, phosphorus, aluminum, iron, strontium-90, and total organic carbon
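
    One plausible reading of "compared statistically for equivalence on the basis of the precision associated with each result" is a normalized-difference check on each replicate pair, sketched below with invented values and 1-sigma uncertainties; the USGS procedure itself may differ in detail.

```python
# Sketch: testing replicate-pair equivalence by comparing the difference of each pair
# with the combined precision of the two results. The values and 1-sigma
# uncertainties are illustrative only.
import numpy as np

# (result_1, sigma_1, result_2, sigma_2) for each replicate pair
pairs = np.array([
    [12.1, 0.4, 11.8, 0.5],
    [ 3.4, 0.2,  3.9, 0.2],
    [45.0, 2.0, 47.5, 2.2],
])

r1, s1, r2, s2 = pairs.T
z = np.abs(r1 - r2) / np.sqrt(s1**2 + s2**2)    # normalized difference
equivalent = z < 1.96                           # roughly a 95% two-sided criterion

for i, (zi, ok) in enumerate(zip(z, equivalent), start=1):
    print(f"pair {i}: |z| = {zi:.2f} -> {'equivalent' if ok else 'not equivalent'}")
print(f"{equivalent.mean():.0%} of pairs equivalent")
```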

  18. Pitch Motion Stabilization by Propeller Speed Control Using Statistical Controller Design

    DEFF Research Database (Denmark)

    Nakatani, Toshihiko; Blanke, Mogens; Galeazzi, Roberto

    2006-01-01

    This paper describes dynamics analysis of a small training boat and a possibility of ship pitch stabilization by control of propeller speed. After upgrading the navigational system of an actual small training boat, in order to identify the model of the ship, the real data collected by sea trials...... were used for statistical analysis and system identification. This analysis shows that the pitching motion is indeed influenced by engine speed and it is suggested that there exists a possibility of reducing the pitching motion by properly controlling the engine throttle. Based on this observation...

  19. Automatic initialization and quality control of large-scale cardiac MRI segmentations.

    Science.gov (United States)

    Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F

    2018-01-01

    Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. The
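
    The quality-control step, a classifier that flags failed segmentations from statistical descriptors, can be sketched as follows; the features, labels and random-forest settings are synthetic placeholders, not the paper's descriptors or trained model.

```python
# Sketch: a quality-control classifier of the kind described above, using simple
# statistical descriptors of a segmentation in a random forest to flag failures.
# Features and labels here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 6))                 # e.g. statistical/pattern/fractal descriptors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)  # 1 = failed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```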

  20. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    Science.gov (United States)

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
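
    The cumulative sum methods mentioned here are commonly implemented as a tabular CUSUM on control-serum results; the sketch below uses an assumed target, analytical SD and the usual k and h design constants, and is not the PDP-based system's code.

```python
# Sketch: a tabular CUSUM on fixed-level control-serum results, one common way to
# implement "cumulative sum methods". Target, sigma and the k/h constants are
# illustrative choices.
import numpy as np

target, sigma = 5.0, 0.15        # assumed target value and analytical SD
k, h = 0.5 * sigma, 4.0 * sigma  # reference value and decision interval

results = np.array([5.02, 4.97, 5.05, 5.18, 5.22, 5.25, 5.27, 5.30])

c_plus = c_minus = 0.0
for i, x in enumerate(results, start=1):
    c_plus = max(0.0, c_plus + (x - target) - k)    # accumulates upward drift
    c_minus = max(0.0, c_minus + (target - x) - k)  # accumulates downward drift
    flag = "SHIFT" if (c_plus > h or c_minus > h) else ""
    print(f"run {i}: x={x:.2f}  C+={c_plus:.3f}  C-={c_minus:.3f} {flag}")
```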

  1. A novel Python program for implementation of quality control in the ELISA.

    Science.gov (United States)

    Wetzel, Hanna N; Cohen, Cinder; Norman, Andrew B; Webster, Rose P

    2017-09-01

    The use of semi-quantitative assays such as the enzyme-linked immunosorbent assay (ELISA) requires stringent quality control of the data. However, such quality control is often lacking in academic settings due to unavailability of software and knowledge. Therefore, our aim was to develop methods to easily implement Levey-Jennings quality control methods. For this purpose, we created a program written in Python (a programming language with an open-source license) and tested it using a training set of ELISA standard curves quantifying the Fab fragment of an anti-cocaine monoclonal antibody in mouse blood. A colorimetric ELISA was developed using a goat anti-human anti-Fab capture method. Mouse blood samples spiked with the Fab fragment were tested against a standard curve of known concentrations of Fab fragment in buffer over a period of 133days stored at 4°C to assess stability of the Fab fragment and to generate a test dataset to assess the program. All standard curves were analyzed using our program to batch process the data and to generate Levey-Jennings control charts and statistics regarding the datasets. The program was able to identify values outside of two standard deviations, and this identification of outliers was consistent with the results of a two-way ANOVA. This program is freely available, which will help laboratories implement quality control methods, thus improving reproducibility within and between labs. We report here successful testing of the program with our training set and development of a method for quantification of the Fab fragment in mouse blood. Copyright © 2017 Elsevier B.V. All rights reserved.
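
    A minimal Levey-Jennings check of the kind the program implements might look like the sketch below; the established mean, SD and new values are invented, and this is not the authors' published code.

```python
# Sketch: a minimal Levey-Jennings check. Limits come from an established mean and
# SD; points beyond 2 SD are flagged for review, beyond 3 SD for rejection.
# All numbers are illustrative.
import numpy as np

established_mean, established_sd = 1.20, 0.08    # from prior standard curves (assumed)
new_values = np.array([1.18, 1.25, 1.02, 1.31, 1.39, 1.22])

z = (new_values - established_mean) / established_sd
for run, (value, zi) in enumerate(zip(new_values, z), start=1):
    status = "OK" if abs(zi) <= 2 else ("warning (>2 SD)" if abs(zi) <= 3 else "reject (>3 SD)")
    print(f"run {run}: {value:.2f}  z={zi:+.2f}  {status}")
```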

  2. A Comparison of Power Quality Controllers

    Directory of Open Access Journals (Sweden)

    Petr Černek

    2012-01-01

    Full Text Available This paper focuses on certain types of FACTS (Flexible AC Transmission System) controllers, which can be used for improving the power quality at the point of connection with the power network. It focuses on types of controllers that are suitable for use in large buildings, rather than in transmission networks. The goal is to compare the features of the controllers in specific tasks, and to clarify which solution is best for a specific purpose. It is in some cases better and cheaper to use a combination of controllers than a single controller. The paper also presents the features of a shunt active harmonic compensator, which is a very modern power quality controller that can be used in many cases, or in combination with other controllers. The comparison was made using a matrix diagram that resulted from mind maps and other analysis tools. The paper should help engineers to choose the best solution for improving the power quality in a specific power network at distribution level.

  3. Modern requirements to quality assurance and control in nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Weidinger, H.G.

    1999-01-01

    This lecture has shown that a new type of quality assurance management has already been successfully introduced in various industries and is now starting to be used increasingly in the nuclear fuel industry. Static authority regulations and a tendency toward bureaucratic understanding and handling of these regulations led to a delayed start and relatively slow progress of these quality strategies in nuclear fuel technology. However, the economic pressure of strong competition and the increasing demands of the utilities as the users of nuclear fuel result in a more determined introduction in this area as well. The different uses of statistical methods by two fuel vendors are shown. Vendor A uses old-fashioned methods: the focus is on expensive final product control and little emphasis is placed on design of experiments and process control. Consequently, this vendor will have high costs, not only for QC and rejection but also for repair and replacement actions after delivery. On the contrary, vendor B invests primarily in design of experiments and process control. This vendor will profit not only from lower direct costs but also from being at the front line of technical development and from enjoying a satisfied and happy customer. Many well-proven quality management tools are available today which help not only to improve quality but also to decrease costs. Still, the progress in using these techniques in nuclear fuel technology is limited and not comparable to the progress in other industries such as automobile production or the electronic industry. (author)

  4. quality control

    International Nuclear Information System (INIS)

    Skujina, A.; Purina, S.; Riekstina, D.

    1999-01-01

    Soils, spruce needles and bracken ferns were found to be the optimal objects for environmental monitoring in the regions of possible radioactive contamination near the Salaspils nuclear reactor and the Ignalina nuclear power plant. The determination of Sr-90 was based on the radiochemical separation of Sr-90 (via Y-90) by HDEHP extraction and counting of the Cerenkov radiation. Quality control of the results was carried out. (authors)

  5. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    International Nuclear Information System (INIS)

    Xu, Shiyu; Chen, Ying; Lu, Jianping; Zhou, Otto

    2015-01-01

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications

  6. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu [Department of Electrical and Computer Engineering, Southern Illinois University Carbondale, Carbondale, Illinois 62901 (United States); Lu, Jianping; Zhou, Otto [Department of Physics and Astronomy and Curriculum in Applied Sciences and Engineering, University of North Carolina Chapel Hill, Chapel Hill, North Carolina 27599 (United States)

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.

  7. A statistical rationale for establishing process quality control limits using fixed sample size, for critical current verification of SSC superconducting wire

    International Nuclear Information System (INIS)

    Pollock, D.A.; Brown, G.; Capone, D.W. II; Christopherson, D.; Seuntjens, J.M.; Woltz, J.

    1992-03-01

    The purpose of this paper is to demonstrate a statistical method for verifying superconducting wire process stability as represented by Ic. The paper does not propose changing the Ic testing frequency for wire during Phase 1 of the present Vendor Qualification Program. The actual statistical limits demonstrated for one supplier's data are not expected to be suitable for all suppliers. However, the method used to develop the limits, and the potential for process improvement through their use, may be applied equally. Implementing the demonstrated method implies that the current practice of testing all pieces of wire from each billet for the purpose of detecting manufacturing process errors (e.g., missing a heat-treatment cycle for part of the billet) can be replaced by other, less costly process control measures. As used in this paper, process control limits for critical current are quantitative indicators of the uniformity of the source manufacturing process. The limits serve as alarms indicating the need for investigation of the manufacturing process
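
    As an illustration, control limits for Ic could be set from a fixed sample of pieces per billet as in the sketch below; the sample values and the 3-sigma convention are assumptions, not the supplier data analyzed in the paper.

```python
# Sketch: process-control limits for critical current (Ic) estimated from a fixed
# sample of wire pieces per billet. The sample values are synthetic.
import numpy as np

ic_sample = np.array([272., 268., 275., 270., 266., 273., 269., 271.])  # Ic in A (synthetic)

mean = ic_sample.mean()
s = ic_sample.std(ddof=1)
ucl, lcl = mean + 3 * s, mean - 3 * s        # limits for individual Ic measurements

print(f"mean Ic = {mean:.1f} A, s = {s:.2f} A")
print(f"control limits for future pieces: ({lcl:.1f}, {ucl:.1f}) A")
```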

  8. Uneven batch data alignment with application to the control of batch end-product quality.

    Science.gov (United States)

    Wan, Jian; Marjanovic, Ognjen; Lennox, Barry

    2014-03-01

    Batch processes are commonly characterized by uneven trajectories due to the existence of batch-to-batch variations. The batch end-product quality is usually measured at the end of these uneven trajectories. It is necessary to align the time differences for both the measured trajectories and the batch end-product quality in order to implement statistical process monitoring and control schemes. Apart from synchronizing trajectories with variable lengths using an indicator variable or dynamic time warping, this paper proposes a novel approach to align uneven batch data by identifying short-window PCA&PLS models at first and then applying these identified models to extend shorter trajectories and predict future batch end-product quality. Furthermore, uneven batch data can also be aligned to be a specified batch length using moving window estimation. The proposed approach and its application to the control of batch end-product quality are demonstrated with a simulated example of fed-batch fermentation for penicillin production. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Developing methods of controlling quality costs

    OpenAIRE

    Gorbunova A. V.; Maximova O. N.; Ekova V. A.

    2017-01-01

    The article examines issues of managing quality costs, problems of applying economic methods of quality control, and the implementation of progressive methods of quality cost management in enterprises, with a view to improving the efficiency of their evaluation and analysis. With the aim of increasing the effectiveness of the cost management mechanism, the authors introduce controlling as a tool for deviation analysis from the standpoint of the process approach. A list of processes and corresponding eva...

  10. Introduction of the new process and quality control methods in fuel fabrication at Siemens/ANF

    International Nuclear Information System (INIS)

    Rogge, K.T.; Fickers, H.H.; Doerr, W.

    2000-01-01

    The central point of ANF's quality philosophy is the process of continuous improvement. With respect to the causes of defects and the effort needed to eliminate them, the importance of continuous improvement is evident: in most cases defects originate in the early stages of a product, but the majority of problems are only detected during fabrication and inspection, and in the worst case when the product is already in use. The goal of the improvement process is to assure high product quality; the efforts are therefore focused on robust and centered processes. Sound quality planning is the basis for achieving and maintaining the quality targets. Quality planning includes prefabrication studies, in-process inspections and final inspections. The inspections provide a large amount of quality data, covering process parameters as well as product properties. Key data are defined and subjected to statistical analysis. For the analysis to be effective, it is important that the process parameters which influence the characteristics of the product are well known and that appropriate methods for data evaluation and visualization are used. The main aim of the data visualization is to obtain tighter control of the product properties and to improve process robustness by implementing defined improvements. With respect to fuel safety and fuel performance, the presentation shows examples of visualized quality data for typical product quality characteristics. The examples include the integrity of the pellet column (rod scanner results), the spring force of PWR spacers (a critical characteristic with regard to rod fretting) and the spacer intersection weld size (thermo-hydraulic fuel bundle behaviour). The presentation also includes an example of statistical process control, the in-line surveillance of fuel rod weld parameters, which assures the integrity of the welds within tight tolerance ranges. The quality

  11. Quality assurance and quality control for Hydro-Quebec's ambient air monitoring networks

    International Nuclear Information System (INIS)

    Lambert, M.; Varfalvy, L.

    1993-01-01

    Hydro Quebec has three ambient air monitoring networks to determine the contribution of some of its thermal plants to ambient air quality. They are located in Becancour (gas turbines), Iles-de-la-Madeleine (diesel), and Tracy (conventional oil-fired). To ensure good quality results and consistency between networks, a quality assurance/quality control program was set up. A description is presented of the ambient air quality monitoring network and the quality assurance/quality control program. A guide has been created for use by the network operators, discussing objectives of the individual network, a complete description of each network, field operation for each model of instrument in use, treatment of data for each data logger in use, global considerations regarding quality assurance and control, and reports. A brief overview is presented of the guide's purpose and contents, focusing on the field operation section and the sulfur dioxide and nitrogen oxide monitors. 6 figs., 1 tab

  12. Performance and quality control of nuclear medicine instrumentation

    International Nuclear Information System (INIS)

    Paras, P.

    1981-01-01

    The status and the recent developments of nuclear medicine instrumentation performance, with an emphasis on gamma-camera performance, are discussed as the basis for quality control. New phantoms and techniques for the measurement of gamma-camera performance parameters are introduced and their usefulness for quality control is discussed. Tests and procedures for dose calibrator quality control are included. Also, the principles of quality control, tests, equipment and procedures for each type of instrument are reviewed, and minimum requirements for an effective quality assurance programme for nuclear medicine instrumentation are suggested. (author)

  13. Bootstrap Signal-to-Noise Confidence Intervals: An Objective Method for Subject Exclusion and Quality Control in ERP Studies

    Science.gov (United States)

    Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.

    2016-01-01

    Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
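
    A minimal sketch of the bootstrap SNR confidence-interval idea on synthetic single-trial data, assuming an illustrative baseline window, component window, resample count and SNR criterion; it is not the authors' released code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-trial ERP data: 60 trials x 500 time points, with an evoked
# component around samples 150-250 plus trial-to-trial noise.
n_trials, n_time = 60, 500
signal = np.zeros(n_time)
signal[150:250] = 2.0 * np.hanning(100)
trials = signal + rng.normal(scale=5.0, size=(n_trials, n_time))

baseline = slice(0, 100)      # pre-stimulus window (noise estimate)
component = slice(150, 250)   # post-stimulus window of interest

def snr(erp):
    """SNR of an averaged waveform: RMS in the component window over RMS of the baseline."""
    return np.sqrt(np.mean(erp[component] ** 2)) / np.sqrt(np.mean(erp[baseline] ** 2))

# Bootstrap: resample trials with replacement, average, and compute the SNR.
boot = np.array([
    snr(trials[rng.integers(0, n_trials, n_trials)].mean(axis=0))
    for _ in range(2000)
])
snr_lb = np.percentile(boot, 2.5)   # lower bound of the 95% SNR-CI (SNR_LB)

criterion = 1.5                      # illustrative signal-quality criterion
print(f"SNR_LB = {snr_lb:.2f} -> {'keep' if snr_lb >= criterion else 'exclude'} subject")
```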

  14. [Coronary artery bypass surgery: methods of performance monitoring and quality control].

    Science.gov (United States)

    Albert, A; Sergeant, P; Ennker, J

    2009-10-01

    The strength of coronary bypass operations depends on the preservation of their benefits regarding freedom from symptoms, quality of life and survival over decades. Significant variability of the results of an operative intervention between hospitals or operating surgeons is considered a weakness of the procedure. External quality assurance tries to create a transparent market for health-care services by making hospital rankings comparable; widely available information and competition are expected to promote improvement of overall quality. The structured dialogue acts as a control instrument for the BQS (Federal Office for Quality Assurance); it is launched in case of deviations from the standard references or statistically significant differences between the results of operations in a given hospital and the national average. Compared with external control, hospital-internal control is better able to reach a medically useful statement about treatment results and to correct mistakes in time. An online information portal based on a departmental databank (DataWarehouse, DataMart) is an attractive solution for physicians to obtain transparent and timely information about variability in performance. The individual surgeon significantly influences short- and long-term treatment results; accordingly, selection, targeted training and performance measurements are necessary. Strict risk management and failure analysis of individual cases are part of the methods of internal quality control, aiming to identify and correct inadequacies in the system and in the course of treatment. According to international as well as our own experience, at least 30% of deaths after bypass operations are avoidable. A functioning quality control is especially important in minimally invasive interventions because they are often technically more demanding than conventional procedures. In the field of OPCAB surgery

  15. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    Science.gov (United States)

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .

  16. Coping and social problem solving correlates of asthma control and quality of life.

    Science.gov (United States)

    McCormick, Sean P; Nezu, Christine M; Nezu, Arthur M; Sherman, Michael; Davey, Adam; Collins, Bradley N

    2014-02-01

    In a sample of adults with asthma receiving care and medication in an outpatient pulmonary clinic, this study tested for statistical associations between social problem-solving styles, asthma control, and asthma-related quality of life. These variables were measured cross-sectionally as a first step toward more systematic application of social problem-solving frameworks in asthma self-management training. Recruitment occurred during pulmonology clinic service hours. Forty-four adults with a physician-confirmed diagnosis of asthma provided data including age, gender, height, weight, race, income, and comorbid conditions. The Asthma Control Questionnaire, the Mini Asthma Quality of Life Questionnaire (Short Form), and peak expiratory flow measures offered multiple views of asthma health at the time of the study. Maladaptive coping (impulsive and careless problem-solving styles), based on transactional stress models of health, was assessed with the Social Problem-Solving Inventory-Revised: Short Form. Controlling for variance associated with gender, age, and income, individuals reporting higher impulsive-careless scores exhibited significantly lower scores on asthma control (β = 0.70, p = 0.001, confidence interval (CI) [0.37-1.04]) and lower asthma-related quality of life (β = 0.79, p = 0.017, CI [0.15-1.42]). These findings suggest that specific maladaptive problem-solving styles may uniquely contribute to asthma health burdens. Because problem-solving coping strategies are both measurable and teachable, behavioral interventions aimed at facilitating adaptive coping and problem solving could positively affect patients' asthma management and quality of life.

  17. COMPARISON OF STATISTICALLY CONTROLLED MACHINING SOLUTIONS OF TITANIUM ALLOYS USING USM

    Directory of Open Access Journals (Sweden)

    R. Singh

    2010-06-01

    Full Text Available The purpose of the present investigation is to compare statistically controlled machining solutions for titanium alloys using ultrasonic machining (USM). In this study, the previously developed Taguchi model for USM of titanium and its alloys has been investigated and compared. Relationships between the material removal rate, tool wear rate, surface roughness and other controllable machining parameters (power rating, tool type, slurry concentration, slurry type, slurry temperature and slurry size) have been deduced. The results of this study suggest that at the best settings of the controllable machining parameters for titanium alloys (based upon the Taguchi design), the machining solution with USM is statistically controlled, which is not observed for other settings of the input parameters on USM.

  18. Quality Management and Juran’s Legacy

    NARCIS (Netherlands)

    Bisgaard, S.

    2007-01-01

    Keywords: Quality Engineering; Six Sigma; Design for Six Sigma. Abstract: Quality management provides the framework for the industrial application of statistical quality control, design of experiments, quality improvement and reliability methods. It is therefore helpful for quality engineers and

  19. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
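
    As a hedged illustration of the Shewhart X-bar and R charting mentioned above, the sketch below computes subgroup control limits with the standard constants for subgroups of size 5; the monitored parameter and the data are synthetic stand-ins, not Ft. Calhoun RCP measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical subgrouped measurements of one monitored RCP parameter
# (e.g. a bearing temperature): 25 subgroups of size 5.
data = rng.normal(loc=60.0, scale=1.2, size=(25, 5))

xbar = data.mean(axis=1)
ranges = data.max(axis=1) - data.min(axis=1)
xbar_bar, r_bar = xbar.mean(), ranges.mean()

# Standard Shewhart chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114
limits = {
    "Xbar": (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar),
    "R": (D3 * r_bar, D4 * r_bar),
}
print(limits)

# Subgroups outside the limits signal a lack of statistical control and would
# serve as an early warning well before control-room alarms trip.
ooc = np.where((xbar < limits["Xbar"][0]) | (xbar > limits["Xbar"][1]))[0]
print("out-of-control subgroups:", ooc)
```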

  20. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    Full Text Available A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions that characterize pathological voice quality. 83 voice samples of the sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented by a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
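
    A rough sketch of the preprocessing idea, assuming an autocorrelation-method LPC of illustrative order 12 and a synthetic stand-in for the sustained /a/ phonation; it reproduces only the residual-plus-higher-order-statistics step, not the paper's CART classifier or its data.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(3)

# Synthetic stand-in for a sustained /a/ phonation: a few harmonics plus noise.
fs, dur, f0 = 8000, 0.5, 150.0
t = np.arange(int(fs * dur)) / fs
y = sum(np.sin(2 * np.pi * k * f0 * t) / k for k in range(1, 6))
y += 0.05 * rng.normal(size=t.size)

def lpc_residual(x, order=12):
    """Prediction error of an autocorrelation-method LPC model of the signal."""
    r = np.correlate(x, x, mode="full")[x.size - 1:][:order + 1]
    a = solve_toeplitz(r[:order], r[1:order + 1])          # forward predictor coefficients
    return lfilter(np.concatenate(([1.0], -a)), [1.0], x)  # inverse (whitening) filter

e = lpc_residual(y)
e = e / np.std(e)                           # normalise before higher-order statistics
print("normalized skewness:", skew(e))
print("normalized kurtosis:", kurtosis(e))  # Fisher (excess) kurtosis
```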

  1. Statistical physics of human beings in games: Controlled experiments

    Science.gov (United States)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  2. Validation of analytical method for quality control of B12 Vitamin-10 000 injection

    International Nuclear Information System (INIS)

    Botet Garcia, Martha; Garcia Penna, Caridad Margarita; Troche Concepcion, Yenilen; Cannizares Arencibia, Yanara; Moreno Correoso, Barbara

    2009-01-01

    The analytical method reported by the USA Pharmacopeia was validated for quality control of injectable B12 vitamin (10 000 U) by UV spectrophotometry, because this is a simpler, low-cost method allowing quality control of the finished product. The calibration curve was constructed over the 60 to 140% interval, where it was linear with a correlation coefficient of 0.9999; the statistical tests for the intercept and slope were non-significant. Recovery was 99.7% over the studied concentration interval, where the Cochran (G) and Student (t) tests were also non-significant. The coefficient of variation in the repeatability study was 0.59% for the 6 assayed replicates, whereas in the intermediate precision analysis the Fisher and Student tests were not significant. The analytical method was linear, precise, specific and accurate over the studied concentration interval.
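
    For illustration, a minimal sketch of the kind of linearity, intercept-significance and repeatability checks described, on made-up calibration data; it assumes scipy >= 1.6 (for intercept_stderr) and is not the validated USP procedure itself.

```python
import numpy as np
from scipy import stats

# Illustrative calibration curve: concentration as % of the nominal value
# (60-140%) versus measured absorbance.
conc = np.array([60, 80, 100, 120, 140], dtype=float)
absorbance = np.array([0.302, 0.401, 0.499, 0.603, 0.701])

res = stats.linregress(conc, absorbance)
print(f"r = {res.rvalue:.4f}  slope = {res.slope:.5f}  intercept = {res.intercept:.4f}")

# Significance test for the intercept (H0: intercept = 0); a non-significant
# result supports proportionality of the method.
n = conc.size
t_intercept = res.intercept / res.intercept_stderr   # requires scipy >= 1.6
p_intercept = 2 * stats.t.sf(abs(t_intercept), df=n - 2)
print(f"intercept t = {t_intercept:.2f}, p = {p_intercept:.3f}")

# Repeatability: coefficient of variation of replicate determinations (% recovery).
replicates = np.array([99.8, 100.4, 99.5, 100.1, 99.9, 100.6])
cv = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"CV = {cv:.2f}%")
```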

  3. 7 CFR 58.733 - Quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.733 Section 58.733... Procedures § 58.733 Quality control tests. (a) Chemical analyses. The following chemical analyses shall be... pasteurization by means of the phosphatase test, as well as any other tests necessary to assure good quality...

  4. Topological and statistical properties of quantum control transition landscapes

    International Nuclear Information System (INIS)

    Hsieh, Michael; Wu Rebing; Rabitz, Herschel; Rosenthal, Carey

    2008-01-01

    A puzzle arising in the control of quantum dynamics is to explain the relative ease with which high-quality control solutions can be found in the laboratory and in simulations. The emerging explanation appears to lie in the nature of the quantum control landscape, which is an observable as a function of the control variables. This work considers the common case of the observable being the transition probability between an initial and a target state. For any controllable quantum system, this landscape contains only global maxima and minima, and no local extrema traps. The probability distribution function for the landscape value is used to calculate the relative volume of the region of the landscape corresponding to good control solutions. The topology of the global optima of the landscape is analysed and the optima are shown to have inherent robustness to variations in the controls. Although the relative landscape volume of good control solutions is found to shrink rapidly as the system Hilbert space dimension increases, the highly favourable landscape topology at and away from the global optima provides a rationale for understanding the relative ease of finding high-quality, stable quantum optimal control solutions

  5. Implementation of dosimetric quality control on IMRT and VMAT treatments in radiotherapy using diodes; Implementacion de control de calidad dosimetrico en tratamientos de IMRT y VMAT en radioterapia usando diodos

    Energy Technology Data Exchange (ETDEWEB)

    Gonzales, A.; Garcia, B.; Ramirez, J.; Marquina, J., E-mail: andres.gonzales@aliada.com.pe [ALIADA, Oncologia Integral, Av. Jose Galvez Barrenechea 1044, San Isidro, Lima 27 (Peru)

    2014-08-15

    To implement quality control of IMRT and VMAT Rapid Arc radiotherapy treatments using diode arrays. Ninety patients treated with IMRT and VMAT Rapid Arc were tested by comparing the planned dose with the delivered dose, using the Map-Check-2 and Arc-Check devices from Sun Nuclear and evaluating the gamma factor with 3%/3 mm comparison criteria. The statistics show that the quality controls of the 90 patients analyzed presented a percentage of passing diodes between 96.7% and 100.0% of the irradiated diodes. The method for quality control of IMRT and VMAT Rapid Arc radiotherapy treatments using diode arrays was implemented at the ALIADA Oncologia Integral clinic. (Author)
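
    As a sketch of the 3%/3 mm gamma comparison mentioned above (a brute-force global 2-D implementation on a toy dose plane, not Sun Nuclear's software), assuming an illustrative 10% low-dose cutoff and a limited search radius:

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm=1.0, dose_tol=0.03, dist_mm=3.0,
                    low_dose_cutoff=0.10):
    """Global 2-D gamma analysis by exhaustive local search (small grids only)."""
    dmax = ref.max()
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    search = int(np.ceil(2 * dist_mm / spacing_mm))  # limited search radius (approximation)
    passed = evaluated = 0
    for i in range(ny):
        for j in range(nx):
            if meas[i, j] < low_dose_cutoff * dmax:
                continue                              # skip very low-dose points
            y0, y1 = max(0, i - search), min(ny, i + search + 1)
            x0, x1 = max(0, j - search), min(nx, j + search + 1)
            dist2 = ((yy[y0:y1, x0:x1] - i) ** 2 +
                     (xx[y0:y1, x0:x1] - j) ** 2) * spacing_mm ** 2
            dose2 = (ref[y0:y1, x0:x1] - meas[i, j]) ** 2
            gamma2 = dist2 / dist_mm ** 2 + dose2 / (dose_tol * dmax) ** 2
            evaluated += 1
            passed += gamma2.min() <= 1.0
    return 100.0 * passed / evaluated

# Toy example: a Gaussian reference dose plane and a slightly perturbed
# "measured" plane standing in for the diode-array measurement.
rng = np.random.default_rng(4)
yy0, xx0 = np.mgrid[0:50, 0:50]
ref = np.exp(-((yy0 - 25.0) ** 2 + (xx0 - 25.0) ** 2) / 200.0)
meas = ref * (1.0 + rng.normal(scale=0.01, size=ref.shape))
print(f"gamma pass rate (3%/3 mm): {gamma_pass_rate(ref, meas):.1f}%")
```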

  6. Descriptive study of the quality control in mammography

    International Nuclear Information System (INIS)

    Gaona, E.; Perdigon C, G.M.; Casian C, G.A.; Azorin N, J.; Diaz G, J.A.I.; Arreola, M.

    2005-01-01

    The goal of mammography is to provide contrast between a lesion possibly residing within the breast and the normal surrounding tissue. Quality control is essential for maintaining the contrast imaging performance of a mammography system and incorporates tests that are relevant in that they are predictive of future degradation of contrast imaging performance. These tests are also performed at a frequency high enough to intercept most drifts in image quality or performance before they become diagnostically significant. The objective of this quality control study is to describe the results of an assessment of image quality elements (film optical density, contrast (density difference), uniformity, resolution and noise) in 62 mammography departments without a quality control program, and to compare these results with a reference mammography department that has a quality control program. The comparison of the results shows the clinical utility of having a quality control program to reduce errors of mammographic interpretation. (Author)

  7. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.

  8. Concrete and steel construction quality control and assurance

    CERN Document Server

    El-Reedy, Mohamed A

    2014-01-01

    Starting with the receipt of materials and continuing all the way through to the final completion of the construction phase, Concrete and Steel Construction: Quality Control and Assurance examines all the quality control and assurance methods involving reinforced concrete and steel structures. This book explores the proper ways to achieve high-quality construction projects, and also provides a strong theoretical and practical background. It introduces information on quality techniques and quality management, and covers the principles of quality control. The book presents all of the quality control and assurance protocols and non-destructive test methods necessary for concrete and steel construction projects, including steel materials, welding and mixing, and testing. It covers welding terminology and procedures, and discusses welding standards and procedures during the fabrication process, as well as the welding codes. It also considers the total quality management system based on ISO 9001, and utilizes numer...

  9. Statistical assessment of coal charge effect on metallurgical coke quality

    Directory of Open Access Journals (Sweden)

    Pavlína Pustějovská

    2016-06-01

    Full Text Available The paper studies coke quality. Blast furnace practice has traditionally concentrated on the iron ore charge, while coke received less attention because, under previous conditions, its quality seemed to be good enough. Nowadays, the requirements for blast furnace coke have risen, especially the requirements for coke reactivity. The level of the reactivity parameter is determined primarily by the composition and properties of the coal mixtures used for coking. The paper presents a statistical analysis of the strength and character of the relationship between selected properties of the coal mixture and coke reactivity. The Statgraphics software was used for the calculations, applying both simple and multiple linear regression. The regression equations obtained provide a statistically significant prediction of coke reactivity, and of coke strength after reaction with CO2, and thus allow their subsequent management through changes in the composition and properties of the coal mixture. The CSR/CRI indexes were determined for the coke. Fifty-four results were acquired in the experimental part, in which the correlation between the CRI index and the coal components was studied. For the linear regression the coefficient of determination was 55.0204%, and between the parameters CRI and inertinite it was 21.5873%. For the regression between CRI and the coal components it was 31.03%, and for the multiple linear regression between CRI and three feedstock components it was 34.0691%. The final correlations show that higher ash content decreases coke reactivity, while a higher content of volatile matter in the coal increases the overall coke reactivity and a higher amount of inertinite in the coal also increases the reactivity. Generally, coke quality is significantly affected by coal processing, carbonization and the maceral content of the coal mixture.

  10. COLLABORATIVE TRIAL AND QUALITY CONTROL IN CHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Narsito Narsito

    2010-06-01

    Full Text Available This paper deals with some practical problems related to the quality of analytical chemical data usually met in practice. Special attention is given to the topic of quality control in analytical chemistry, since analytical data are one of the primary sources of information from which important scientifically based decisions are made. The paper starts with a brief description of some fundamental aspects associated with the quality of analytical data, such as sources of variation of analytical data, criteria for the quality of analytical methods, and quality assurance in chemical analysis. The assessment of quality parameters for analytical methods, such as the use of standard materials as well as standard methods, is given. Concerning the quality control of analytical data, the use of several techniques, such as control samples and control charts, for monitoring analytical data in a quality control program is described qualitatively. In the final part of the paper, some important remarks on the preparation of collaborative trials, including the evaluation of the accuracy and reproducibility of analytical methods, are also given. Keywords: collaborative trials, quality control, analytical data

  11. 42 CFR 84.40 - Quality control plans; filing requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control plans; filing requirements. 84.40... Control § 84.40 Quality control plans; filing requirements. As a part of each application for approval or... proposed quality control plan which shall be designed to assure the quality of respiratory protection...

  12. 30 CFR 28.30 - Quality control plans; filing requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  13. Characterization of Surface Water and Groundwater Quality in the Lower Tano River Basin Using Statistical and Isotopic Approach.

    Science.gov (United States)

    Edjah, Adwoba; Stenni, Barbara; Cozzi, Giulio; Turetta, Clara; Dreossi, Giuliano; Tetteh Akiti, Thomas; Yidana, Sandow

    2017-04-01

    This research is part of the PhD research work "Hydrogeological Assessment of the Lower Tano river basin for sustainable economic usage, Ghana, West Africa". In this study, the researchers investigated surface water and groundwater quality in the Lower Tano river basin. The assessment was based on selected sampling sites associated with mining activities and with the development of oil and gas. A statistical approach was applied to characterize the quality of surface water and groundwater. In addition, stable water isotopes, natural tracers of the hydrological cycle, were used to investigate the origin of groundwater recharge in the basin. The study revealed that the Pb and Ni values of the surface water and groundwater samples exceeded the WHO standards for drinking water. In addition, a water quality index (WQI) based on physicochemical parameters (EC, TDS, pH) and major ions (Ca2+, Na+, Mg2+, HCO3-, NO3-, Cl-, SO42-, K+) indicated good quality water for 60% of the sampled surface water and groundwater. Other indices, such as the heavy metal pollution index (HPI), the degree of contamination (Cd), and the heavy metal evaluation index (HEI), based on trace element concentrations in the water samples, reveal that 90% of the surface water and groundwater samples show a high level of pollution. Principal component analysis (PCA) also suggests that the water quality in the basin is likely affected by rock-water interaction and anthropogenic activities (sea water intrusion). This

  14. Sampling methods to the statistical control of the production of blood components.

    Science.gov (United States)

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

    The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of this control depends on the sampling; however, a correct sampling methodology does not seem to be systematically applied. Commonly, the sampling is intended solely to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model can be argued not to correspond to a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for the purpose they are suggested for. Copyright © 2017 Elsevier Ltd. All rights reserved.
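
    A minimal sketch of the second proposal (sampling based on the proportion of a finite population), assuming a normal approximation with finite-population correction and illustrative lot sizes; the article's own sample-size tables may differ.

```python
import math
from scipy.stats import norm

def sample_size_finite(N, p=0.01, margin=0.01, confidence=0.95):
    """Sample size for estimating a proportion p within +/- margin in a
    finite lot of N blood components (normal approximation with FPC)."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    n0 = z ** 2 * p * (1 - p) / margin ** 2        # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))      # finite-population correction

# Example: monthly productions of different sizes, verifying a nonconforming
# rate around the 1% specification.
for N in (500, 2000, 4000):
    print(f"lot of {N} components -> sample {sample_size_finite(N)} units")
```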

  15. Protein quality control in the nucleus

    DEFF Research Database (Denmark)

    Nielsen, Sofie V.; Poulsen, Esben Guldahl; Rebula, Caio A.

    2014-01-01

    to aggregate, cells have evolved several elaborate quality control systems to deal with these potentially toxic proteins. First, various molecular chaperones will seize the misfolded protein and either attempt to refold the protein or target it for degradation via the ubiquitin-proteasome system...... to be particularly active in protein quality control. Thus, specific ubiquitin-protein ligases located in the nucleus, target not only misfolded nuclear proteins, but also various misfolded cytosolic proteins which are transported to the nucleus prior to their degradation. In comparison, much less is known about...... these mechanisms in mammalian cells. Here we highlight recent advances in our understanding of nuclear protein quality control, in particular regarding substrate recognition and proteasomal degradation....

  16. 21 CFR 211.22 - Responsibilities of quality control unit.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Responsibilities of quality control unit. 211.22... Personnel § 211.22 Responsibilities of quality control unit. (a) There shall be a quality control unit that... have been fully investigated. The quality control unit shall be responsible for approving or rejecting...

  17. 40 CFR 81.112 - Charleston Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.112 Charleston Intrastate Air Quality Control Region. The Charleston Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... Quality Control Region: Region 1. 81.107Greenwood Intrastate Air Quality Control Region: Region 2. 81...

  18. Quality management using TQC

    International Nuclear Information System (INIS)

    Hwang, Ui Cheol

    1992-03-01

    This book introduces the concepts of quality and quality management; standardization, with its meaning, principles and structure; systems and company standards for quality management; quality management in planning, organization and operation; quality assurance with system evaluation and information; policy management covering policy, objectives and daily work; quality management of research and development infrastructure; quality management of design and production facilities; statistical methods for quality management; control charts; and process capability.

  19. Effect of radiation dose and adaptive statistical iterative reconstruction on image quality of pulmonary computed tomography

    International Nuclear Information System (INIS)

    Sato, Jiro; Akahane, Masaaki; Inano, Sachiko; Terasaki, Mariko; Akai, Hiroyuki; Katsura, Masaki; Matsuda, Izuru; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    The purpose of this study was to assess the effects of dose and adaptive statistical iterative reconstruction (ASIR) on image quality of pulmonary computed tomography (CT). Inflated and fixed porcine lungs were scanned with a 64-slice CT system at 10, 20, 40 and 400 mAs. Using automatic exposure control, 40 mAs was chosen as standard dose. Scan data were reconstructed with filtered back projection (FBP) and ASIR. Image pairs were obtained by factorial combination of images at a selected level. Using a 21-point scale, three experienced radiologists independently rated differences in quality between adjacently displayed paired images for image noise, image sharpness and conspicuity of tiny nodules. A subjective quality score (SQS) for each image was computed based on Anderson's functional measurement theory. The standard deviation was recorded as a quantitative noise measurement. At all doses examined, SQSs improved with ASIR for all evaluation items. No significant differences were noted between the SQSs for 40%-ASIR images obtained at 20 mAs and those for FBP images at 40 mAs. Compared to the FBP algorithm, ASIR for lung CT can enable an approximately 50% dose reduction from the standard dose while preserving visualization of small structures. (author)

  20. The statistical reporting quality of articles published in 2010 in five dental journals.

    Science.gov (United States)

    Vähänikkilä, Hannu; Tjäderhane, Leo; Nieminen, Pentti

    2015-01-01

    Statistical methods play an important role in medical and dental research. In earlier studies it has been observed that current use of methods and reporting of statistics are responsible for some of the errors in the interpretation of results. The aim of this study was to investigate the quality of statistical reporting in dental research articles. A total of 200 articles published in 2010 were analysed covering five dental journals: Journal of Dental Research, Caries Research, Community Dentistry and Oral Epidemiology, Journal of Dentistry and Acta Odontologica Scandinavica. Each paper underwent careful scrutiny for the use of statistical methods and reporting. A paper with at least one poor reporting item has been classified as 'problems with reporting statistics' and a paper without any poor reporting item as 'acceptable'. The investigation showed that 18 (9%) papers were acceptable and 182 (91%) papers contained at least one poor reporting item. The proportion of at least one poor reporting item in this survey was high (91%). The authors of dental journals should be encouraged to improve the statistical section of their research articles and to present the results in such a way that it is in line with the policy and presentation of the leading dental journals.

  1. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Full Text Available Abstract Background Scoring functions, such as molecular mechanic forcefields and statistical potentials are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with a statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility, on torsion angles, accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasible long energy relaxation before energy scores started to correlate with model quality.

  2. An empirical comparison of key statistical attributes among potential ICU quality indicators.

    Science.gov (United States)

    Brown, Sydney E S; Ratcliffe, Sarah J; Halpern, Scott D

    2014-08-01

    Good quality indicators should have face validity, relevance to patients, and be able to be measured reliably. Beyond these general requirements, good quality indicators should also have certain statistical properties, including sufficient variability to identify poor performers, relative insensitivity to severity adjustment, and the ability to capture what providers do rather than patients' characteristics. We assessed the performance of candidate indicators of ICU quality on these criteria. Indicators included ICU readmission, mortality, several length of stay outcomes, and the processes of venous-thromboembolism and stress ulcer prophylaxis provision. Retrospective cohort study. One hundred thirty-eight U.S. ICUs from 2001-2008 in the Project IMPACT database. Two hundred sixty-eight thousand eight hundred twenty-four patients discharged from U.S. ICUs. None. We assessed indicators' (1) variability across ICU-years; (2) degree of influence by patient vs. ICU and hospital characteristics using the Omega statistic; (3) sensitivity to severity adjustment by comparing the area under the receiver operating characteristic curve (AUC) between models including vs. excluding patient variables, and (4) correlation between risk adjusted quality indicators using a Spearman correlation. Large ranges of among-ICU variability were noted for all quality indicators, particularly for prolonged length of stay (4.7-71.3%) and the proportion of patients discharged home (30.6-82.0%), and ICU and hospital characteristics outweighed patient characteristics for stress ulcer prophylaxis (ω, 0.43; 95% CI, 0.34-0.54), venous thromboembolism prophylaxis (ω, 0.57; 95% CI, 0.53-0.61), and ICU readmissions (ω, 0.69; 95% CI, 0.52-0.90). Mortality measures were the most sensitive to severity adjustment (area under the receiver operating characteristic curve % difference, 29.6%); process measures were the least sensitive (area under the receiver operating characteristic curve % differences

  3. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for the control chart patterns (CCPs) recognition in two aspects. First, an efficient system is introduced that includes two main modules: feature extraction module and classifier module. In the feature extraction module, a proper set of shape features and statistical feature are proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as multilayer perceptron, probabilistic neural network and radial basis function are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on cuckoo optimization algorithm (COA) algorithm to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
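
    A hedged sketch of the two-module idea (feature extraction plus a neural-network classifier) on synthetic control chart patterns; the features, pattern types and the scikit-learn MLP are illustrative assumptions, and the paper's cuckoo-optimization step is not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

def make_pattern(kind, n=32):
    """Synthetic control chart pattern: normal, upward trend, or upward shift."""
    t = np.arange(n)
    x = rng.normal(size=n)
    if kind == "trend":
        x += 0.1 * t
    elif kind == "shift":
        x += 2.0 * (t >= n // 2)
    return x

def features(x):
    """Simple statistical and shape features of a window of observations."""
    t = np.arange(x.size)
    slope = np.polyfit(t, x, 1)[0]
    return [x.mean(), x.std(), slope,
            x[: x.size // 2].mean() - x[x.size // 2:].mean(),  # half-window jump
            np.mean(np.abs(np.diff(x)))]                       # mean oscillation

kinds = ["normal", "trend", "shift"]
X = np.array([features(make_pattern(k)) for k in kinds for _ in range(300)])
y = np.repeat(kinds, 300)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(Xtr, ytr)
print("recognition accuracy:", clf.score(Xte, yte))
```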

  4. 40 CFR 81.88 - Billings Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.88 Billings Intrastate Air Quality Control Region. The Metropolitan Billings Intrastate Air Quality Control Region (Montana) has been renamed the Billings Intrastate Air Quality Control... to by Montana authorities as follows: Sec. 481.168Great Falls Intrastate Air Quality Control Region...

  5. Intra- and Intercellular Quality Control Mechanisms of Mitochondria

    Directory of Open Access Journals (Sweden)

    Yoshimitsu Kiriyama

    2017-12-01

    Full Text Available Mitochondria function to generate ATP and also play important roles in cellular homeostasis, signaling, apoptosis, autophagy, and metabolism. The loss of mitochondrial function results in cell death and various types of diseases. Therefore, quality control of mitochondria via intra- and intercellular pathways is crucial. Intracellular quality control consists of biogenesis, fusion and fission, and degradation of mitochondria in the cell, whereas intercellular quality control involves tunneling nanotubes and extracellular vesicles. In this review, we outline the current knowledge on the intra- and intercellular quality control mechanisms of mitochondria.

  6. Fuel cycle and quality control

    International Nuclear Information System (INIS)

    Stoll, W.

    1979-01-01

    The scale of the fuel cycle is described in terms of its economic importance and its throughput, as envisaged for the Federal Republic of Germany. Quality is defined as the continuing usefulness of an object and translated into quality criteria. Requirements on the performance of fuel elements are defined. The way in which experimental results are translated into mass production of fuel rods is described. The economic potential for further quality effort is derived. Future directions of development for quality control organisation and structure are outlined. (Auth.)

  7. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)

  8. 7 CFR 58.141 - Alternate quality control program.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Alternate quality control program. 58.141 Section 58... Service 1 Quality Specifications for Raw Milk § 58.141 Alternate quality control program. When a plant has in operation an acceptable quality program, at the producer level, which is approved by the...

  9. Quality control of 11C-carfentanil

    International Nuclear Information System (INIS)

    Zhang Xiaojun; Zhang Jinming; Tian Jiahe; Xiang Xiaohui

    2013-01-01

    To study the quality control of the 11C-carfentanil injection, physical, chemical and biological identification methods were used. The chemical and radiochemical purity of the 11C-carfentanil injection were determined by HPLC with a flow-count system; the quantity of product was measured by LC-MS and the specific activity was then calculated; the PTS was used to detect endotoxin, and other quality control methods were set up to guarantee the safety of its clinical application. The product appeared colorless and transparent, the radiochemical purity was more than 98%, and the endotoxin content was less than 5 EU/mL. The results showed that the 11C-carfentanil injection fulfilled pharmaceutical quality control requirements and could be applied safely in animal experiments and clinical diagnosis. (authors)

  10. Statistical Process Control. Impact and Opportunities for Ohio.

    Science.gov (United States)

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  11. Water quality control system and water quality control method

    International Nuclear Information System (INIS)

    Itsumi, Sachio; Ichikawa, Nagayoshi; Uruma, Hiroshi; Yamada, Kazuya; Seki, Shuji

    1998-01-01

    In the water quality control system of the present invention, the portions in contact with water comprise a metal material having a controlled content of iron or chromium, and the chromium content on the surface is higher than that of the base material, with compressive stresses remaining on the surface after mechanical polishing, so as to form a uniform corrosion-resistant coating film. In addition, equipment and/or pipelines whose surfaces are made of a material that stably controls the corrosion potential are used. A cleaning device is provided that is made of a material forming few impurities and that detects the intrusion of impurities and removes them selectively depending on the chemical species, and/or a cleaning device for recovering drain from various kinds of equipment to the feedwater, connecting a feedwater pipeline and a condensate pipeline and removing impurities and corrosion products. The water can thus be kept as neutral purified water, and the concentrations of oxygen and hydrogen in the water are controlled within an optimum range to suppress the formation of corrosion products. (N.H.)

  12. Internal quality control of PCR-based genotyping methods

    DEFF Research Database (Denmark)

    Bladbjerg, Else-Marie; Gram, Jørgen; Jespersen, Jørgen

    2002-01-01

    Internal quality control programmes for genetic analyses are needed. We have focused on quality control aspects of selected polymorphism analyses used in thrombosis research. DNA was isolated from EDTA-blood (n = 500) and analysed for 18 polymorphisms by polymerase chain reaction (PCR), i...... because of positive reagent blanks (controls (Control of data handling revealed 0.1% reading mistakes and 0.5% entry mistakes. Based on our experiences, we propose an internal quality control programme......, electrophoresis (analytical factors), result reading and entry into a database (post-analytical factors). Furthermore, we evaluated a procedure for result confirmation. Isolated DNA was of good quality (42 micrograms/ml blood, A260/A280 ratio > 1.75, negative DNAsis tests). Occasionally, results were reanalysed...

  13. 40 CFR 81.36 - Maricopa Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.36 Maricopa Intrastate Air Quality Control Region. The Phoenix-Tucson Intrastate Air Quality Control Region has been renamed the Maricopa Intrastate Air Quality Control Region... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Maricopa Intrastate Air Quality...

  14. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2013-01-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
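
    A minimal Monte Carlo sketch of uplink ICI under fractional (slow) power control and Generalized-K composite fading, of the kind used to validate such models; all cell-geometry, power-control and fading parameters are illustrative assumptions, and the paper's closed-form expressions are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)

n_drops = 200_000
R = 500.0             # interfering-cell radius (m)
D = 1000.0            # distance between interfering and victim base stations (m)
eta = 3.7             # path-loss exponent
P0, Pmax = 1e-7, 0.2  # power-control target (W at 1 m) and maximum transmit power (W)
alpha = 0.8           # fractional path-loss compensation factor
m, k = 2.0, 4.0       # Generalized-K shape parameters (multipath / shadowing)

# Drop interfering users uniformly in a disc around their serving base station.
r = R * np.sqrt(rng.random(n_drops))
theta = 2 * np.pi * rng.random(n_drops)

# Fractional power control: compensate a fraction alpha of the own-cell path loss.
p_tx = np.minimum(Pmax, P0 * r ** (alpha * eta))

# Distance from each interferer to the victim base station.
d_victim = np.hypot(D + r * np.cos(theta), r * np.sin(theta))

# Generalized-K composite fading as the product of two unit-mean gamma variates.
fading = rng.gamma(m, 1.0 / m, n_drops) * rng.gamma(k, 1.0 / k, n_drops)

ici = p_tx * fading * d_victim ** (-eta)
print(f"mean ICI = {ici.mean():.3e} W, 95th percentile = {np.percentile(ici, 95):.3e} W")
```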

  15. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  16. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    Science.gov (United States)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994 the Data Centre of the Spanish Oceanographic Institute has been developing systems for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean data centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now includes not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. New powerful routines for analysis and for graphic visualization were added. Data originally presented in ASCII format were recently organized in an open source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS format and quality control. QCDAMAR: Quality Control of Marine Data, the main set of tools for working with data presented as text files; it includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data and impossible regional values) and input/output filters. QCMareas: a set of procedures for the quality control of tide gauge data according to the standards of the international sea level observing system; these procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: a relational database (MySQL) designed to
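
    As an illustration of two of the automatic checks listed above (spike detection and density inversion), a minimal sketch on a synthetic CTD profile; the thresholds and the simplified linear equation of state are assumptions, and an operational system would apply the MEDATLAS criteria and TEOS-10 (gsw) instead.

```python
import numpy as np

# Synthetic CTD profile (pressure in dbar, temperature in degC, salinity in PSU)
# with an artificial temperature spike injected at 150 dbar.
p = np.arange(0, 500, 5.0)
t = 20.0 - 0.03 * p + np.where(p == 150, 4.0, 0.0)
s = 35.0 + 0.002 * p

def spike_test(v, threshold):
    """Argo-style spike test: flag points that deviate sharply from their neighbours."""
    test = np.abs(v[1:-1] - 0.5 * (v[2:] + v[:-2])) - 0.5 * np.abs(v[2:] - v[:-2])
    flags = np.zeros_like(v, dtype=bool)
    flags[1:-1] = test > threshold
    return flags

def density_inversion_test(t, s, tol=-0.03):
    """Flag levels where density decreases with depth (simplified linear
    equation of state; a real implementation would use TEOS-10 / gsw)."""
    rho = 1025.0 * (1 - 2.0e-4 * (t - 10.0) + 7.6e-4 * (s - 35.0))
    flags = np.zeros_like(t, dtype=bool)
    flags[1:] = np.diff(rho) < tol
    return flags

print("temperature spikes at p =", p[spike_test(t, threshold=2.0)])
print("density inversions at p =", p[density_inversion_test(t, s)])
```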

  17. Quality control and quality assurance in individual monitoring of ionising radiations

    International Nuclear Information System (INIS)

    Dutt, J.C.; Lindborg, L.

    1994-01-01

    This paper describes the programmes and approaches that are to be considered in developing and introducing quality assurance and quality control procedures in individual monitoring services. Quality assurance and quality control in individual monitoring services are essential to maintain quality and are of increasing importance in order to meet the requirements of national regulations and international standards and guidelines. It is recommended here that all organisations offering individual monitoring services should run their services based on the principles of a Quality System as given in the European Standard EN45001 and maintain a properly resourced QA/QC programme as an integral part of their operations. All aspects of QA/QC in individual monitoring services, starting from the initial selection, installation, calibration, and operation to the final products, including dose reporting, dose record keeping, dealing with customers' complaints and product liability issues, have been discussed. (Author)

  18. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement.
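
    As a minimal illustration of the SPC charting described above, the sketch below builds an individuals control chart with 3-sigma limits estimated from the moving range. The surveillance readings are invented; this is not the Hanford procedure itself.

```python
# Minimal individuals (X) control chart: centre line and 3-sigma limits estimated
# from the average moving range. Data are hypothetical surveillance readings.
import numpy as np

def individuals_chart(x):
    """Return centre line, UCL and LCL for an individuals control chart."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))          # moving ranges of consecutive points
    sigma = mr.mean() / 1.128        # d2 = 1.128 for subgroups of size 2
    centre = x.mean()
    return centre, centre + 3 * sigma, centre - 3 * sigma

readings = [10.2, 10.4, 10.1, 10.3, 10.5, 10.2, 11.9, 10.3]   # hypothetical tank readings
cl, ucl, lcl = individuals_chart(readings)
out_of_control = [i for i, v in enumerate(readings) if v > ucl or v < lcl]
print(f"CL={cl:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, out-of-control points: {out_of_control}")
```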

  19. Investigating output and energy variations and their relationship to delivery QA results using Statistical Process Control for helical tomotherapy.

    Science.gov (United States)

    Binny, Diana; Mezzenga, Emilio; Lancaster, Craig M; Trapp, Jamie V; Kairn, Tanya; Crowe, Scott B

    2017-06-01

    The aims of this study were to investigate machine beam parameters using the TomoTherapy quality assurance (TQA) tool, establish a correlation to patient delivery quality assurance results and to evaluate the relationship between energy variations detected using different TQA modules. TQA daily measurement results from two treatment machines for periods of up to 4 years were acquired. Analyses of beam quality, helical and static output variations were made. Variations from planned dose were also analysed using the Statistical Process Control (SPC) technique, and their relationship to output trends was studied. Energy variations appeared to be one of the contributing factors to the delivery output dose variations seen in the analysis. Ion chamber measurements were reliable indicators of energy and output variations and were linear with patient dose verifications. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  20. Application of multivariate statistical techniques in the water quality assessment of Danube river, Serbia

    Directory of Open Access Journals (Sweden)

    Voza Danijela

    2015-12-01

    Full Text Available The aim of this article is to evaluate the quality of the Danube River in its course through Serbia as well as to demonstrate the possibilities for using three statistical methods: Principal Component Analysis (PCA), Factor Analysis (FA) and Cluster Analysis (CA) in surface water quality management. Given that the Danube is an important trans-boundary river, thorough water quality monitoring by sampling at different distances during shorter and longer periods of time is not only an ecological, but also a political issue. Monitoring was carried out at monthly intervals from January to December 2011, at 17 sampling sites. The obtained data set was treated by multivariate techniques in order, firstly, to identify the similarities and differences between sampling periods and locations, secondly, to recognize variables that affect the temporal and spatial water quality changes and, thirdly, to present the anthropogenic impact on water quality parameters.
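
    A brief sketch of the PCA step is given below for orientation. The data are synthetic (the Danube measurements are not reproduced); the point is only to show how standardized samples are decomposed and how much variance each principal component explains.

```python
# Hedged illustration of the PCA step on synthetic water-quality data:
# standardize the samples and inspect the explained variance per component.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# 17 sites x 12 months = 204 samples, 8 hypothetical parameters (pH, DO, BOD, ...).
samples = rng.normal(size=(204, 8))
samples[:, 1] += 0.8 * samples[:, 0]      # induce correlation between two parameters

pca_model = PCA().fit(StandardScaler().fit_transform(samples))
print("explained variance ratio:", np.round(pca_model.explained_variance_ratio_, 3))
```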

  1. Quality Control - Nike.Inc

    OpenAIRE

    Walter G. Bishop

    2017-01-01

    The purpose of this paper is to present an illustration of the quality control approach that has been adopted by several organizations in order to manage and improve their production processes. The approach is referred to as total quality management (TQM). This study will discuss the implementation of TQM within the working environment of Nike Inc. One of the major objectives behind the implementation of TQM is to reduce or completely eliminate potential errors and flaws within the manufacturing...

  2. How to set up and manage quality control and quality assurance

    NARCIS (Netherlands)

    Visschedijk, M.; Hendriks, R.; Nuyts, K.

    2005-01-01

    This document provides a general introduction to clarify the differences between quality control (QC) and quality assurance (QA). In addition it serves as a starting point for implementing a quality system approach within an organization. The paper offers practical guidance to the implementation of

  3. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  4. Solution standards for quality control of nuclear-material analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.

    1981-01-01

    Analytical chemistry measurement control depends upon reliable solution standards. At the Savannah River Plant Control Laboratory over a thousand analytical measurements are made daily for process control, product specification, accountability, and nuclear safety. Large quantities of solution standards are required for a measurement quality control program covering the many different analytical chemistry methods. Savannah River Plant-produced uranium, plutonium, neptunium, and americium metals or oxides are dissolved to prepare stock solutions for working or Quality Control Standards (QCS). Because extensive analytical effort is required to characterize or confirm these solutions, they are prepared in large quantities. These stock solutions are diluted and blended with different chemicals and/or each other to synthesize QCS that match the matrices of different process streams. The target uncertainty of a standard's reference value is 10% of the limit of error of the methods used for routine measurements. Standard Reference Materials from NBS are used according to special procedures to calibrate the methods used in measuring the uranium and plutonium standards so traceability can be established. Special precautions are required to minimize the effects of temperature, radiolysis, and evaporation. Standard reference values are periodically corrected to eliminate systematic errors caused by evaporation or decay products. Measurement control is achieved by requiring analysts to analyze a blind QCS each shift that a measurement system is used on plant samples. Computer evaluation determines whether or not a measurement is within the ±3 sigma control limits. Monthly evaluations of the QCS measurements are made to determine current bias correction factors for accountability measurements and to detect significant changes in the bias and precision statistics. The evaluations are also used to plan activities for improving the reliability of the analytical chemistry measurements.

  5. 21 CFR 111.105 - What must quality control personnel do?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What must quality control personnel do? 111.105..., LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control § 111.105 What must quality control personnel do? Quality control personnel must...

  6. Reducing lumber thickness variation using real-time statistical process control

    Science.gov (United States)

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  7. The Study on quality control of nuclear power installation project

    International Nuclear Information System (INIS)

    Wu Jie

    2008-01-01

    The quality planning, quality assurance and quality control of the project are discussed by applying quality control (QC) theory in combination with the real situation of the Qinshan II project. By applying this QC theory and the associated control techniques, the paper provides practical guidance for project quality control. (authors)

  8. Teaching Quality Control with Chocolate Chip Cookies

    Science.gov (United States)

    Baker, Ardith

    2014-01-01

    Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…

  9. [Infection control management and practice in home care - analysis of structure quality].

    Science.gov (United States)

    Spegel, H; Höller, C; Randzio, O; Liebl, B; Herr, C

    2013-02-01

    Surveillance of infection control management and practices in home care is an important task of the public health service. While infection control aspects in residential and nursing homes for the aged are increasingly being discussed, this subject has been poorly recognised in home care. The aim of this study was to identify problems in hygiene regarding the transmission of infectious diseases as well as quality assessment in home care. Based on the results of this study, implications for infection control in home care facilities should be developed for public health services. Statistical analyses were performed on the primary quality assessment data of home care facilities collected by the medical service of the health insurances via computer-assisted personal interviews between March 2006 and March 2009. Structure quality in 194 home care facilities was analysed, as well as human resources and organisational conditions. Analyses were also done in the context of the clients' risk factor load. All analyses were performed by stratifying for the size of the home care services. To assess how the involved characteristics vary according to the size of the home care services, chi-square tests and non-parametric tests were calculated. About 80% of the assessed home care services had an infection control management plan. Compared with larger services, smaller home care services, especially services with fewer than 10 clients, had a poor structure in infection control management and practice. They also carried a higher load of risk factors in clients. The larger services had significantly fewer human resources. Surveillance of infection control management and practices by the public health services should focus on the structure of the smaller home care services. At the same time, smaller home care services should be supported by offering training for the staff or counselling regarding hygiene-related aspects. Furthermore, the outcome quality of the larger home care services with

  10. Internal control reporting and accounting quality : Insight "comply-or-explain" internal control regime

    OpenAIRE

    Cao Thi Thanh, Huyen; Cheung, Tina

    2010-01-01

    Nowadays, there exist two reporting regimes, rules-based and principle-based (comply-or-explain). In the rules-based environment, researchers have studied the relationship between internal control quality and accounting quality. Prior studies have suggested that reports on internal control are an effective way for investors to evaluate the quality of the firm's internal control. By having a sound system of internal control, it creates reliance upon the firm's financial reporting. Therefore, t...

  11. SQC: secure quality control for meta-analysis of genome-wide association studies.

    Science.gov (United States)

    Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre

    2017-08-01

    Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the 10-month time span usually observed for the completion of the QC procedure that includes timing of logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
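
    For orientation, the sketch below shows two routine (non-secure) GWAMA quality checks that a protocol such as SQC would have to reproduce under encryption: a minor-allele-frequency filter and the genomic inflation factor lambda_GC. The data, thresholds and variable names are assumptions for illustration, not part of the SQC implementation.

```python
# Illustrative (non-secure) GWAS meta-analysis QC checks on synthetic summary
# statistics: drop rare variants and compute the genomic inflation factor.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
maf = rng.uniform(0.0, 0.5, size=100_000)    # per-variant minor allele frequencies
z = rng.normal(size=100_000)                 # per-variant association z-scores

keep = maf >= 0.01                           # assumed minor-allele-frequency threshold
chi2 = z[keep] ** 2
lambda_gc = np.median(chi2) / stats.chi2.ppf(0.5, df=1)   # reference median ~0.455
print(f"variants kept: {keep.sum()}, lambda_GC = {lambda_gc:.3f}")
```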

  12. The results of a quality-control programme in mammography

    International Nuclear Information System (INIS)

    Ramsdale, M.L.; Hiles, P.A.

    1989-01-01

    With the introduction of a breast screening programme in the UK, quality assurance in mammography is of paramount importance in assuring optimum imaging performance with low dose. Quality control checks are an essential part of the quality-assurance system. A quality-control programme at a breast screening clinic is described. Daily checks include film sensitometry for X-ray processor control and radiography of a lucite phantom to monitor the consistency of the X-ray machine automatic exposure control. Weekly checks include additional measurements on the performance of the automatic exposure control for different breast thickness and an overall assessment of image quality using a prototype mammography test phantom. The test phantom measures low-contrast sensitivity, high-contrast resolution and small-detail visibility. The results of the quality-control programme are presented with particular attention paid to tolerances and limiting values. (author)

  13. PENGURANGAN DEFECT PADA PRODUK SEPATU DENGAN MENGINTEGRASIKAN STATISTICAL PROCESS CONTROL (SPC DAN ROOT CAUSE ANALYSIS (RCA STUDI KASUS PT. XYZ

    Directory of Open Access Journals (Sweden)

    Moch. Teguh Fajrin

    2018-04-01

    Full Text Available PT. XYZ is a foreign-investment company located in the Sidoarjo area with more than 7,000 employees, and it naturally wants to produce high-quality products. PT. XYZ's footwear products are world-class products that emphasize quality and wearer comfort. The problem in the production process of PT. XYZ is that, of its 7 injection machines, the machines with the highest numbers of defects are machines 5, 6 and 7. Product quality heavily influences whether the shoes reach consumers: if damaged, a product cannot be distributed to consumers. Therefore, the quality of the shoes must be maintained for successful marketing of the product. Previous improvements were made to reduce defects in the production process, but the results were less than optimal, so this research aims to make further improvements with another method. The method used in this research is to apply Statistical Process Control (SPC) and Root Cause Analysis (RCA). From the check sheet analysis and the Pareto diagram, the frequency of product defects (over roughing) was highest in March 2016, with 345 shoes (62% of the total production of 489 products), followed by February 2016, with 214 shoes (38% of the total production of 357 products). From the p control chart it can be concluded that the data are in an uncontrolled state: of the 50 data points, one lies outside the control limits (out of control); the 4th point has a proportion value of 1, beyond the Upper Control Limit (UCL) of 0.938. As long as the points lie within the control limits, the process is under control and no action is required. However, the one point lying outside the control limit indicates that quality control in the (injection) production process at PT. XYZ is uncontrolled or still experiencing
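
    The p-chart calculation referred to above can be sketched as follows. The subgroup counts here are invented (chosen so that one subgroup has proportion 1, echoing the situation in the abstract); only the centre line and control-limit formulas follow the standard p-chart definition.

```python
# Sketch of a p control chart: per-subgroup defective proportion with 3-sigma
# limits around the pooled proportion. Counts are hypothetical.
import numpy as np

defects = np.array([12, 9, 15, 20, 11, 8, 14])      # defective shoes per subgroup (invented)
inspected = np.array([20, 18, 22, 20, 19, 17, 21])  # shoes inspected per subgroup (invented)

p = defects / inspected
p_bar = defects.sum() / inspected.sum()             # pooled proportion (centre line)
sigma = np.sqrt(p_bar * (1 - p_bar) / inspected)    # per-subgroup standard error
ucl = np.minimum(p_bar + 3 * sigma, 1.0)
lcl = np.maximum(p_bar - 3 * sigma, 0.0)

for day, (pi, u, l) in enumerate(zip(p, ucl, lcl), start=1):
    status = "out of control" if (pi > u or pi < l) else "in control"
    print(f"subgroup {day}: p={pi:.3f}, LCL={l:.3f}, UCL={u:.3f} -> {status}")
```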

  14. A quality assessment of randomized controlled trial reports in endodontics.

    Science.gov (United States)

    Lucena, C; Souza, E M; Voinea, G C; Pulgar, R; Valderrama, M J; De-Deus, G

    2017-03-01

    To assess the quality of the randomized clinical trial (RCT) reports published in Endodontics between 1997 and 2012. Retrieval of RCTs in Endodontics was based on a search of the Thomson Reuters Web of Science (WoS) database (March 2013). Quality evaluation was performed using a checklist based on the Jadad criteria, CONSORT (Consolidated Standards of Reporting Trials) statement and SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials). Descriptive statistics were used for frequency distribution of data. Student's t-test and Welch test were used to identify the influence of certain trial characteristics upon report quality (α = 0.05). A total of 89 RCTs were evaluated, and several methodological flaws were found: only 45% had random sequence generation at low risk of bias, 75% did not provide information on allocation concealment, and 19% were nonblinded designs. Regarding statistics, only 55% of the RCTs performed adequate sample size estimations, only 16% presented confidence intervals, and 25% did not provide the exact P-value. Also, 2% of the articles used no statistical tests, and in 87% of the RCTs, the information provided was insufficient to determine whether the statistical methodology applied was appropriate or not. Significantly higher scores were observed for multicentre trials (P = 0.023), RCTs signed by more than 5 authors (P = 0.03), articles belonging to journals ranked above the JCR median (P = 0.03), and articles complying with the CONSORT guidelines (P = 0.000). The quality of RCT reports in key areas for internal validity of the study was poor. Several measures, such as compliance with the CONSORT guidelines, are important in order to raise the quality of RCTs in Endodontics. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  15. K-type geomagnetic index nowcast with data quality control

    Directory of Open Access Journals (Sweden)

    René Warnant

    2011-07-01

    Full Text Available

    A nowcast system for operational estimation of a proxy K-type geomagnetic index is presented. The system is based on a fully automated computer procedure for real-time digital magnetogram data acquisition that includes screening of the dataset and removal of the outliers, estimation of the solar regular variation (SR) of the geomagnetic field, calculation of the index, and issuing of an alert if storm-level activity is indicated. This is a time-controlled (rather than event-driven) system that delivers the regular output of: the index value, the estimated quality flag, and, if warranted, an alert. The novel features provided are, first, the strict control of the data input and processing, and second, the increased frequency of production of the index (every 1 h). Such quality control and increased time resolution have been found to be of crucial importance for various applications, e.g. ionospheric monitoring, that are of particular interest to us and to users of our service. The nowcast system operability, accuracy and precision have been tested with instantaneous measurements from recent years. A statistical comparison between the nowcast and the definitive index values shows that the average root-mean-square error is smaller than 1 KU. The system is now operational at the site of the Geophysical Centre of the Royal Meteorological Institute in Dourbes (50.1ºN, 4.6ºE), and it is being used for alerting users when geomagnetic storms take place.

  16. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)
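
    A hedged sketch of the paired, non-parametric comparison used above follows: a Wilcoxon signed-rank test on per-patient contrast-noise ratios for the conventional and ASIR reconstructions. The values are synthetic; only the choice of test mirrors the abstract.

```python
# Paired non-parametric comparison on synthetic data (not the study's 19 patients):
# Wilcoxon signed-rank test of CNR for conventional vs. ASIR reconstructions.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(3)
cnr_conventional = rng.normal(5.0, 1.0, size=19)              # assumed portal-vein CNRs
cnr_asir = cnr_conventional + rng.normal(1.2, 0.4, size=19)   # assumed ASIR improvement

stat, p_value = wilcoxon(cnr_conventional, cnr_asir)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.4f}")
```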

  17. 40 CFR 75.59 - Certification, quality assurance, and quality control record provisions.

    Science.gov (United States)

    2010-07-01

    ... specified in Equation A-9 in appendix A to this part; (D) Statistical “t” value used in calculations; (E... monitoring systems, the coefficient or “K” factor or other mathematical algorithm used to adjust the... application, or periodically if a different method is used for annual quality assurance testing). (2) For each...

  18. Control of quality in mammography

    International Nuclear Information System (INIS)

    2006-10-01

    The present protocol of quality control/quality assurance in mammography is the result of the work of two regional projects realised in Latin America within the frame of ARCAL with the support of the IAEA. The first is the ARCAL LV (RLA/6/043) project on quality assurance/quality control in mammography studies, which analysed the present situation of mammography in the member countries of the project: Bolivia, Colombia, Costa Rica, Cuba, El Salvador, Guatemala, Nicaragua, Panama, Paraguay, Peru, the Dominican Republic and the Republic of Venezuela. The second is the ARCAL XLIX (RLA/9/035) project, whose members were Brazil, Colombia, Cuba, Chile, Mexico and Peru, which worked on the application of the Basic Safety Standards for protection against ionising radiation, with the aim of improving radiation protection in X-ray diagnosis medical practices through the implementation of the Basic Safety Standards (BSS) related to X-ray diagnosis in selected hospitals located in each country involved in the project. The work of both projects has been consolidated and harmonized in the present publication.

  19. The study on quality control of bedside CR examination

    International Nuclear Information System (INIS)

    Yang Xufeng; Luo Xiaomei; Xu Qiaolan; Wu Tengfang; Wen Xingwei

    2007-01-01

    Objective: To study the quality control of bedside CR examinations and to improve imaging quality. Methods: X-ray examinations with a CR system were performed on 3,300 patients. All CR cassettes were encoded. The imaging plates and cassettes were cleaned regularly. Results: With and without quality control, the percentage of first-rate films was 58.2% and 51%, of second-rate films 40% and 45.5%, and of third-rate films 1.3% and 2%, respectively. Correspondingly, the ratio of re-examination decreased from 1.5% to 0.5% after quality control, and imaging quality was stable. Conclusion: The quality control of bedside CR examinations can improve image quality as well as lighten the workload of radiographers. (authors)

  20. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    International Nuclear Information System (INIS)

    Carver, A; Rowbottom, C

    2016-01-01

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian’s MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker’s honorarium from Varian

  1. Quality assurance/quality control, reliability and availability of nuclear power plants

    International Nuclear Information System (INIS)

    Kueffer, K.

    1981-01-01

    In its first part, this lecture presents a survey of nuclear power production and plant performance in the Western World and discusses key parameters such as load factors and non-availability. Some main reasons for the reliable performance of nuclear power plants are given. The second part of the lecture deals with the question of how quality assurance and quality control measures directly influence plant reliability, availability and, thus, economy. Derived from the worldwide experience gained from operating nuclear power plants, it may be concluded that the implementation of an overall quality assurance programme does not only satisfy safety requirements set forth by the nuclear regulatory bodies, but also has a considerable impact on plant reliability and availability. A positive effect on these figures will be achieved if the established quality assurance programme provides for a coordinated approach to all activities affecting quality. It is discussed how the quality of a product should be controlled and what kinds of quality assurance measures should be performed; examples are given to demonstrate that the expenditure for maintenance work on components will decrease if planned and systematic quality assurance actions have been implemented during all procurement stages. (orig./RW)

  2. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate Air Quality Control Region (Oregon-Washington) has been revised to consist of the territorial area... Portland Interstate Air Quality Control Region (Oregon-Washington) will be referred to by Washington...

  3. Statistical transformation and the interpretation of inpatient glucose control data.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
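
    The workflow described above can be sketched as follows: Box-Cox transform the skewed POC-BG values, then build an EWMA chart on subgroup means. The glucose values, subgrouping and EWMA settings (lambda = 0.2, L = 3) are assumptions for illustration, not the study's data or parameters.

```python
# Box-Cox transformation of skewed glucose data followed by an EWMA control chart
# on subgroup means. All values and chart settings are illustrative assumptions.
import numpy as np
from scipy.stats import boxcox

rng = np.random.default_rng(4)
glucose = rng.lognormal(mean=np.log(140), sigma=0.3, size=3000)   # skewed POC-BG values, mg/dL

transformed, lam = boxcox(glucose)        # maximum-likelihood Box-Cox transformation
print(f"estimated Box-Cox lambda: {lam:.2f}")

# EWMA chart on subgroup means (30 subgroups of 100 transformed values).
subgroup_means = transformed.reshape(30, 100).mean(axis=1)
lam_ewma, L = 0.2, 3.0                    # smoothing constant and control-limit width
mu, sigma = subgroup_means.mean(), subgroup_means.std(ddof=1)

ewma = np.empty_like(subgroup_means)
prev = mu                                 # start the EWMA at the centre line
for i, x in enumerate(subgroup_means):
    prev = lam_ewma * x + (1 - lam_ewma) * prev
    ewma[i] = prev

t = np.arange(1, subgroup_means.size + 1)
half_width = L * sigma * np.sqrt(lam_ewma / (2 - lam_ewma) * (1 - (1 - lam_ewma) ** (2 * t)))
out_of_control = np.flatnonzero((ewma > mu + half_width) | (ewma < mu - half_width))
print("out-of-control subgroups:", out_of_control)
```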

  4. Assessment of Near-Bottom Water Quality of Southwestern Coast of Sarawak, Borneo, Malaysia: A Multivariate Statistical Approach

    Directory of Open Access Journals (Sweden)

    Chen-Lin Soo

    2017-01-01

    Full Text Available Studies on Sarawak coastal water quality are scarce, not to mention the application of the multivariate statistical approach to investigate the spatial variation of water quality and to identify the pollution sources in Sarawak coastal water. Hence, the present study aimed to evaluate the spatial variation of water quality along the coastline of the southwestern region of Sarawak using multivariate statistical techniques. Seventeen physicochemical parameters were measured at 11 stations along the coastline, which is approximately 225 km long. The coastal water quality showed spatial heterogeneity, with the cluster analysis grouping the 11 stations into four different clusters. Deterioration in coastal water quality has been observed in different regions of Sarawak, corresponding to land use patterns in the region. Nevertheless, nitrate-nitrogen exceeded the guideline value at all sampling stations along the coastline. The principal component analysis (PCA) determined a reduced number of five principal components that explained 89.0% of the data set variance. The first PC indicated that nutrients were the dominant polluting factors, attributed to domestic, agricultural, and aquaculture activities, followed by the suspended solids in the second PC, which are related to logging activities.

  5. Quality assurance in education: The role of ICT and quality control ...

    African Journals Online (AJOL)

    Quality assurance in education is perceived in this paper to be a product of the impact of information and communication technologies as well as the statutory control measures especially in tertiary institutions in Nigeria. The paper reviews the concept of quality and quality assurance and their general application to ...

  6. Quality control of scintillation cameras (planar and SPECT)

    International Nuclear Information System (INIS)

    Shaekhoon, E.S.

    2008-01-01

    Regular quality control is one of the cornerstones of nuclear medicine and a prerequisite for adequate diagnostic imaging. Many papers have been published on quality control of planar and SPECT imaging systems up to now; however, only minor attention has been given to the assessment of the performance of imaging systems. In this research we discuss a comprehensive set of test procedures, including regular quality control. Our purpose is to go through an analysis of the methods and results and then to test our hypothesis, which states that there is a strong relationship between regular and proper evaluation of quality control and the continuity of better medical services in a nuclear medicine department. The selection of the tests is discussed and the tests are described; then results are presented. In addition, action thresholds are proposed. The quality control tests can be applied to systems with either a moving detector or a moving image table, and to both detectors with a large field of view and detectors with a small field of view. The tests presented in this research do not require special phantoms or sources other than those used for quality control of stationary gamma cameras and SPECT. They can be applied for acceptance testing and for performance testing in a regular quality assurance program. The data have been evaluated using the Mediso software in comparison with the IAEA expert software and the system specification within the reference values. Our final results confirm our hypothesis; some issues concerning the characteristics and performance of this system were observed and solved, and a departmental protocol for routine quality control (QC) has been established. (Author)

  7. PACS quality control and automatic problem notifier

    Science.gov (United States)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, through sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, trouble-shooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment is correct. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established

  8. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    Science.gov (United States)

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Proposed quality control protocol of a dual energy bone densitometer from Spanish protocol for quality control of radiology

    International Nuclear Information System (INIS)

    Saez, F.; Benito, M. A.; Collado, P.; Saez, M.

    2011-01-01

    In this paper we propose additional tests to complement those of the Spanish Protocol for Quality Control in Diagnostic Radiology, taking into account the particular characteristics of these units, and including these tests in the area of patient dose assessment. It is also possible to independently verify the quality control tests that are performed automatically.

  10. Statistical Control of Measurement Quality; Controle Statistique de la Qualite de la Mesure; Statisticheskim kontrol' kachestva izmerenij; Control Estadistico de la Calidad de las Mediciones

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C. A. [Battelle Memorial Institute, Richland, WA (United States)

    1966-02-15

    Effective nuclear materials management, and hence design and operation of associated material control systems, depend heavily on the quality of the quantitative data on which they are based. Information concerning the reliability of the measurement methods employed is essential both to the determination of data requirements and to the evaluation of results obtained. Any method of analysis should be (1) relatively free from bias and (2) reproducible, or, in more usual terminology, precise. Many statistical techniques are available to evaluate and control the reproducibility of analytical results. Economical and effective experimental designs have been developed for the segregation of different sources of measurement error. Procedures have been developed or adapted for use in maintaining and controlling the precision of routine measurements. All of these techniques require that at least some measurements must be duplicated, but duplication of all measurements can be justified only when the detection of every gross error, or mistake, is extremely important. Three types of measurement bias can be considered: (1) bias relative to a standard, (2) bias relative to prior experience, and (3) bias relative to a group. The first refers to the degree to which the measurements obtained deviate systematically from some ''standard'' which is unbiased either (1) by definition, or (2) because all known sources of bias have been removed. The second is concerned with the presence of systematic differences over a period of time. The third type of bias concerns the relationship between different physical entities or individuals at a given time. Recent developments in statistical methodology applicable to the evaluation of all three types of bias are discussed. Examples of the use of the statistical techniques discussed on Hanford data are presented. (author)
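
    The two evaluations distinguished above, reproducibility from duplicate measurements and bias against a standard, can be sketched as follows with synthetic data (not Hanford results). The duplicate-difference formula and the one-sample t-test are standard choices assumed for the example.

```python
# Precision estimated from duplicate analyses and bias tested against a standard
# reference value. All data are synthetic; formulas are standard textbook choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Duplicate analyses of k samples: precision from the paired differences.
k = 30
true_values = rng.uniform(9.0, 11.0, size=k)
dup1 = true_values + rng.normal(0, 0.05, size=k)
dup2 = true_values + rng.normal(0, 0.05, size=k)
s_within = np.sqrt(np.sum((dup1 - dup2) ** 2) / (2 * k))   # s = sqrt(sum d^2 / 2k)
print(f"within-run standard deviation from duplicates: {s_within:.3f}")

# Bias relative to a standard: repeated measurements of a known reference value.
reference_value = 10.00
measurements = reference_value + 0.03 + rng.normal(0, 0.05, size=20)  # small positive bias
t_stat, p_value = stats.ttest_1samp(measurements, reference_value)
print(f"mean bias = {measurements.mean() - reference_value:+.3f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```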

  11. Analytical approaches to quality assurance and quality control in rangeland monitoring data

    Science.gov (United States)

    Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...

  12. AUTOMATION OF THE SYSTEM OF INTERNAL LABORATORY QUALITY CONTROL

    Directory of Open Access Journals (Sweden)

    V. Z. Stetsyuk

    2015-05-01

    Full Text Available A quality control system is based on the principles of standardization of all phases of laboratory testing and on the analysis of internal laboratory quality control and external quality assessment. To verify the accuracy of laboratory test results, both internal laboratory and inter-laboratory quality control are carried out. By internal laboratory quality control we mean the evaluation of the measurement results of each analysis in each analytical series, performed directly in the laboratory every day. The purpose of internal laboratory control is to identify and eliminate unacceptable deviations from standard test performance in the laboratory, i.e. to identify and eliminate harmful analytical errors. The solution of these problems by implementing an automated system - software that optimizes the analytical stage of laboratory research by automatically creating process control charts - is shown.

  13. Validation of an image quality index: its correlation with quality control parameters

    International Nuclear Information System (INIS)

    Cabrejas, M.L.C.; Giannone, C.A.; Arashiro, J.G.; Cabrejas, R.C.

    2002-01-01

    Objective and Rationale: To validate a new image quality index (the Performance Index: PI) that assesses the detectability of simulated lesions with a phantom. This index presumably must depend markedly on quality control (QC) parameters such as tomographic uniformity (Unif), Centre of Rotation (COR) and spatial resolution (FWHM). The simultaneous effects of the QC parameters may explain much of the variation in the PIs; i.e. they may be predictors of the PI values. Methods: An overall performance phantom containing 3 sections was used. The first, uniform, section was used to determine tomographic uniformity. From the analysis of the slices corresponding to the second section, containing 8 cold cylindrical simulated lesions of different diameters (range 7 mm - 17 mm), the numbers of true and false positives are determined, and from these a new Performance Index (PI) is defined as the ratio between the positive predictive value and the sensitivity (expressed as its complement, adding a constant to avoid a singularity). A point source located on the top of the phantom was used to determine the Centre of Rotation and the spatial resolution expressed by the FWHM in mm. 40 nuclear medicine labs participated in the survey. Standard multiple regression analysis between the Performance Index, as dependent variable, and FWHM, COR and Unif as independent variables was performed to evaluate the influence of the QC parameters on the PI values. Results: It is shown that resolution and COR are both predictors of the PIs, with statistical significance for the multiple correlation coefficient R. However, the addition of the variable tomographic uniformity to the model does not improve the prediction of PIs. Moreover, the regression model lacks overall statistical significance. A regression summary for the dependent variable Performance Index is presented. Conclusions: We confirm that the new Performance Index (PI) depends on QC parameters such as COR and spatial resolution. Those labs whose PIs are out
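
    The regression analysis described above can be sketched as an ordinary least-squares fit of the Performance Index on the three QC parameters. The 40 "laboratories" below are simulated (with uniformity deliberately non-predictive), so the coefficients and R^2 are illustrative only.

```python
# OLS fit of a Performance Index on FWHM, COR offset and tomographic uniformity,
# with R^2 as the summary statistic. Data are simulated, not the survey results.
import numpy as np

rng = np.random.default_rng(6)
n_labs = 40
fwhm = rng.normal(10.0, 1.5, n_labs)          # spatial resolution, mm (assumed)
cor = np.abs(rng.normal(0.0, 1.0, n_labs))    # centre-of-rotation offset, pixels (assumed)
unif = rng.normal(5.0, 1.0, n_labs)           # % tomographic non-uniformity (assumed)
pi = 1.0 + 0.08 * fwhm + 0.10 * cor + rng.normal(0, 0.1, n_labs)   # Unif not predictive

X = np.column_stack([np.ones(n_labs), fwhm, cor, unif])
coef, *_ = np.linalg.lstsq(X, pi, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((pi - pred) ** 2) / np.sum((pi - pi.mean()) ** 2)
print("coefficients (intercept, FWHM, COR, Unif):", np.round(coef, 3))
print(f"multiple R^2 = {r2:.3f}")
```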

  14. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  15. 14 CFR 21.147 - Changes in quality control system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Changes in quality control system. 21.147 Section 21.147 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... quality control system. After the issue of a production certificate, each change to the quality control...

  16. 21 CFR 111.117 - What quality control operations are required for equipment, instruments, and controls?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for equipment, instruments, and controls? 111.117 Section 111.117 Food and Drugs FOOD AND DRUG ADMINISTRATION... and Process Control System: Requirements for Quality Control § 111.117 What quality control operations...

  17. Metrology and quality control handbook

    International Nuclear Information System (INIS)

    Hofmann, D.

    1983-01-01

    This book tries to present the fundamentals of metrology and quality control in brief surveys. Compromises had to be made in order to reduce the material available to a sensible volume for the sake of clarity. This becomes evident from the following two restrictions which had to be made: First, in dealing with the theoretical principles of metrology and quality control, mere reference had to be made in many cases to the great variety of special literature without discussing it to explain further details. Second, in dealing with the application of metrology and quality control techniques in practice, only the basic quantities of the International System of Units (SI) could be taken into account as a rule. Some readers will note that many special measuring methods and equipment known to them are not included in this book. I do hope, however, that this shortcoming will show to have a positive effect, too. This book will show the reader how to find the basic quantities and units from the derived quantities and units, and the steps that are necessary to solve any kind of measuring task. (orig./RW) [de

  18. New insight into the comparative power of quality-control rules that use control observations within a single analytical run.

    Science.gov (United States)

    Parvin, C A

    1993-03-01

    The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
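
    A small simulation in the spirit of the comparison above is sketched below for the case of two control observations per run: it estimates how often a mean rule, a within-run range rule (for two observations the range is equivalent, up to a constant, to the within-run standard deviation), and their combination reject a run under a stable process, a systematic shift, and an increase in random error. The rule limits are assumed values chosen for roughly 5% false rejection each; note that combining the rules raises the false-rejection rate unless the limits are widened.

```python
# Simulation of QC-rule rejection rates with two control observations per run,
# under a stable process, a systematic shift, and doubled random error.
import numpy as np

rng = np.random.default_rng(7)
n_runs = 200_000

def rejection_rates(shift=0.0, sd=1.0):
    controls = rng.normal(shift, sd, size=(n_runs, 2))              # 2 controls per run, SD units
    mean_rule = np.abs(controls.mean(axis=1)) > 1.96 / np.sqrt(2)   # ~5% false rejection
    range_rule = np.abs(controls[:, 0] - controls[:, 1]) > 2.77     # ~5% false rejection
    return mean_rule.mean(), range_rule.mean(), (mean_rule | range_rule).mean()

for label, kwargs in [("stable", {}), ("2 SD shift", {"shift": 2.0}), ("doubled SD", {"sd": 2.0})]:
    m, r, combined = rejection_rates(**kwargs)
    print(f"{label:>10}: mean rule {m:.2f}, range rule {r:.2f}, combined {combined:.2f}")
```

    As the abstract suggests, the mean rule responds mainly to the systematic shift, the range rule mainly to the increased random error, and the combined rule trades a higher false-rejection rate for sensitivity to both.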

  19. MRI quality control: six imagers studied using eleven unified image quality parameters

    International Nuclear Information System (INIS)

    Ihalainen, T.; Sipilae, O.; Savolainen, S.

    2004-01-01

    Quality control of the magnetic resonance imagers of different vendors in the clinical environment is non-harmonised, and comparing the performance is difficult. The purpose of this study was to develop and apply a harmonised long-term quality control protocol for the six imagers in our organisation in order to assure that they fulfil the same basic image quality requirements. The same Eurospin phantom set and identical imaging parameters were used with each imager. Values of 11 comparable parameters describing the image quality were measured. Automatic image analysis software was developed to objectively analyse the images. The results proved that the imagers were operating at a performance level adequate for clinical imaging. Some deficiencies were detected in image uniformity and geometry. The automated analysis of the Eurospin phantom images was successful. The measurements were successfully repeated after 2 weeks on one imager and after half a year on all imagers. As an objective way of examining the image quality, this kind of comparable and objective quality control of different imagers is considered as an essential step towards harmonisation of the clinical MRI studies through a large hospital organisation. (orig.)

  20. Intercalibration study. Net of quality control of waters of the Department of Antioquia

    International Nuclear Information System (INIS)

    Parra M, C.M; Mejia Z, G.M.

    1999-01-01

    The ISO 5725 standard sets out a series of statistical procedures for the evaluation of the results of an intercalibration study, which is of course a fundamental support for setting up the quality control programme that must be implemented by every laboratory seeking accreditation. In the present paper the implementation of such procedures is shown for an exercise classified as being of uniform level. The chosen parameter was suspended solids, which is included in the fees of the retributive rates set by the Ministerio del Medio Ambiente in Colombia. The exercise was carried out by the laboratories that are members of the Analytical Control of Water Network in the Department of Antioquia

  1. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using a logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
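
    A hedged sketch of the kind of logistic-regression estimate described above follows, using synthetic error/success data and invented binary PSF codings (procedure quality, practice level). Exponentiating the fitted coefficients gives the multiplicative effects on the odds of error mentioned in the abstract; nothing here reproduces the study's actual data or factor definitions.

```python
# Logistic regression of error occurrence on two assumed binary PSFs, with the
# fitted coefficients reported as multiplicative effects on the odds of error.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 600
procedure_quality = rng.integers(0, 2, n)     # 0 = good, 1 = poor (assumed coding)
practice_level = rng.integers(0, 2, n)        # 0 = practiced, 1 = unpracticed (assumed)

# Simulate error opportunities with an assumed baseline HEP and PSF effects.
logit_p = -4.0 + 1.2 * procedure_quality + 0.8 * practice_level
errors = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([procedure_quality, practice_level]))
fit = sm.Logit(errors, X).fit(disp=False)
print(fit.params)                                        # log-odds coefficients
print("multiplicative effects on the odds of error:",
      np.round(np.exp(fit.params[1:]), 2))
```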

  2. European quality assurance and quality control for cut-off walls and caps

    International Nuclear Information System (INIS)

    Jefferis, S.A.

    1997-01-01

    Cut-off walls and caps both may be seriously compromised by small areas of substandard materials or work. Quality assurance/quality control is therefore of crucial importance and the paper sets out the issues that need to be addressed when designing a quality plan for a containment. Consideration is given to the purpose of the containment, the parameters to be controlled, specifications and standards and tests on raw and manufactured materials and on the in-situ containment. It is not the purpose of the paper to give detailed test procedures but rather to identify the questions that must be answered to develop a quality plan

  3. A new instrument for statistical process control of thermoset molding

    International Nuclear Information System (INIS)

    Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.

    1991-01-01

    The recent development of a rugged ceramic mold-mounted dielectric sensor and high-speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work, special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points, as well as temperature and pressure, were recorded during normal production and plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper examples of both RIM and SMC are discussed
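
    The following sketch shows how one such automatically extracted critical point could be tracked on an SPC chart, here an individuals (X) chart with 3-sigma limits estimated from the moving range. The charted variable (reaction endpoint time) and the run values are hypothetical.

    ```python
    # Sketch of an individuals (X) control chart for one monitored critical point,
    # e.g. reaction endpoint time per molding run.  The run data are hypothetical.
    import numpy as np

    endpoint_s = np.array([118, 121, 119, 122, 120, 117, 123, 119, 140, 120, 118, 122])

    center = endpoint_s.mean()
    mr_bar = np.abs(np.diff(endpoint_s)).mean()     # average moving range
    sigma_hat = mr_bar / 1.128                      # d2 = 1.128 for subgroups of 2
    ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

    for run, x in enumerate(endpoint_s, start=1):
        flag = "OUT OF CONTROL" if (x > ucl or x < lcl) else ""
        print(f"run {run:2d}: {x:6.1f}  (LCL {lcl:.1f}, UCL {ucl:.1f}) {flag}")
    ```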

  4. Computer-supported quality control in X-ray diagnosis

    International Nuclear Information System (INIS)

    Maier, W.; Klotz, E.

    1989-01-01

    Quality control of X-ray facilities in radiological departments of large hospitals is possible only if the instrumentation used for measurements is interfaced to a computer. The central computer helps to organize the measurements as well as analyse and record the results. It can also be connected to a densitometer and camera for evaluating radiographs of test devices. Other quality control tests are supported by a mobile station with equipment for non-invasive dosimetry measurements. Experience with a computer-supported system in quality control of film and film processing is described and the evaluation methods of ANSI and the German industrial standard DIN are compared. The disadvantage of these methods is the exclusion of film quality parameters, which can make processing control almost worthless. (author)

  5. Quality control of radiation therapy in clinical trials

    International Nuclear Information System (INIS)

    Kramer, S.; Lustig, R.; Grundy, G.

    1983-01-01

    The RTOG is a group of participating institutions which has a major interest in furthering clinical radiation oncology. They have formulated protocols for clinical investigation in which radiation therapy is the major modality of treatment. In addition, other modalities, such as chemotherapy, radiation sensitizers, and hyperthermia, are used in a combined approach to cancer. Quality control in all aspects of patient management is necessary to ensure quality data. These areas include evaluation of pathology, physics and dosimetry, and clinical patient data. Quality control is both time-consuming and expensive. However, by dividing these tasks into various levels and time frames, by using computerized data-control mechanisms, and by employing appropriate levels of ancillary personnel expertise, quality control can improve compliance and decrease the cost of investigational trials

  6. Secondary Control for Voltage Quality Enhancement in Microgrids

    DEFF Research Database (Denmark)

    Savaghebi, Mehdi; Jalilian, Alireza; Vasquez, Juan Carlos

    2012-01-01

    In this paper, a hierarchical control scheme is proposed for enhancement of sensitive load bus (SLB) voltage quality in microgrids. The control structure consists of primary and secondary levels. The primary control level comprises the distributed generators' (DGs) local controllers.

  7. Shipping/Receiving and Quality Control

    Data.gov (United States)

    Federal Laboratory Consortium — Shipping/receiving, quality control, large and precise inspection and CMM machines. Coordinate Measuring Machines, including "scanning" probes, optical comparators,...

  8. Quality control of the activity meter

    International Nuclear Information System (INIS)

    Rodrigues, Marlon da Silva Brandão; Sá, Lídia Vasconcelos de

    2017-01-01

    Objective: To carry out a comparative analysis of national and international standards regarding the quality control of the activity meter used in Nuclear Medicine Services in Brazil. Material and methods: Quality control protocols from the International Atomic Energy Agency (IAEA), the American Association of Physicists in Medicine (AAPM) and the International Electrotechnical Commission (IEC) were identified and compared with the requirements of both regulatory Brazilian agencies, the National Health Surveillance Agency (ANVISA) and the National Nuclear Energy Commission (CNEN). Results: The daily routine tests recommended by the regulatory agencies do not show significant differences; in contrast, the tests with longer periodicities (accuracy, linearity and precision) show discrepant differences. Conclusion: In view of the comparative analysis carried out, it is suggested that the national recommendations for the quality control tests of the activity meter should be reviewed and evaluated, with emphasis on the tests with semiannual and annual periodicity. (author)

  9. Sensometrics for Food Quality Control

    DEFF Research Database (Denmark)

    Brockhoff, Per B.

    2011-01-01

    The industrial development of innovative and successful food items and the measuring of food quality in general are difficult without actually letting human beings evaluate the products using their senses at some point in the process. The use of humans as measurement instruments calls for special attention in the modelling and data analysis phase. In this paper the focus is on sensometrics, the "metric" side of the sensory science field. The sensometrics field is introduced and related to the fields of statistics, chemometrics and psychometrics. Some of the most commonly used sensory testing...

  10. Quality assurance programme and quality control

    International Nuclear Information System (INIS)

    Alvarez de Buergo, L.

    1979-01-01

    The paper analyses the requirements for the quality assurance and control in nuclear power plant projects which are needed to achieve safe, reliable and economic plants. The author describes the structure for the establishment of a nuclear programme at the national level and the participation of the different bodies involved in a nuclear power plant project. The paper ends with the study of a specific case in Spain. (NEA) [fr

  11. Quality control of imaging devices

    International Nuclear Information System (INIS)

    Soni, P.S.

    1992-01-01

    Quality assurance in nuclear medicine refers collectively to all aspects of a nuclear medicine service. It would include patient scheduling, radiopharmaceutical preparation and dispensing, radiation protection of patients, staff and the general public, preventive maintenance and the care of instruments, methodology, data interpretation and record keeping, and many other small things which contribute directly or indirectly to the overall quality of a nuclear medicine service in a hospital. Quality control, on the other hand, refers to a single component of the system and is usually applied in relation to a specific instrument and its performance

  12. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in the Terengganu River Basin

    International Nuclear Information System (INIS)

    Aminu Ibrahim; Hafizan Juahir; Mohd Ekhwan Toriman; Mustapha, A.; Azman Azid; Isiyaka, H.A.

    2015-01-01

    Multivariate statistical techniques including cluster analysis, discriminant analysis, and principal component analysis/factor analysis were applied to investigate the spatial variation and pollution sources in the Terengganu river basin during 5 years of monitoring of 13 water quality parameters at thirteen different stations. Cluster analysis (CA) classified the 13 stations into 2 clusters, low polluted (LP) and moderately polluted (MP), based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH3-N, PO4 and EC) and a correct assignation of 95.80 %. The PCA/FA applied to the data sets yielded five latent factors accounting for 72.42 % of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic, industrial, runoff and agricultural sources (anthropogenic activities). Therefore, multivariate techniques are important in environmental management. (author)
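
    The multivariate workflow summarized above (standardize, cluster the stations, extract latent factors with PCA) can be sketched as follows; the station-by-parameter matrix here is random placeholder data, and the two-cluster cut and the 70 % variance target are assumptions for illustration only.

    ```python
    # Sketch of cluster analysis + PCA on a station-by-parameter water-quality
    # matrix.  The data are random placeholders, not the monitored values.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    stations = [f"S{i+1:02d}" for i in range(13)]
    X = rng.normal(size=(13, 13))          # 13 stations x 13 water-quality parameters

    Z = StandardScaler().fit_transform(X)  # standardize before CA/PCA

    # hierarchical cluster analysis (Ward linkage), cut into two station groups
    clusters = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
    print(dict(zip(stations, clusters)))

    # principal component analysis: latent factors needed for ~70 % of variance
    pca = PCA().fit(Z)
    cum = np.cumsum(pca.explained_variance_ratio_)
    print("components for 70 % variance:", int(np.searchsorted(cum, 0.70) + 1))
    ```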

  13. Adaptive statistical iterative reconstruction use for radiation dose reduction in pediatric lower-extremity CT: impact on diagnostic image quality.

    Science.gov (United States)

    Shah, Amisha; Rees, Mitchell; Kar, Erica; Bolton, Kimberly; Lee, Vincent; Panigrahy, Ashok

    2018-06-01

    For the past several years, increased levels of imaging radiation and cumulative radiation to children have been a significant concern. Although several measures have been taken to reduce radiation dose during computed tomography (CT) scans, the newer dose reduction software adaptive statistical iterative reconstruction (ASIR) has been an effective technique in reducing radiation dose. To our knowledge, no studies have been published that assess the effect of ASIR on extremity CT scans in children. To compare radiation dose, image noise, and subjective image quality in pediatric lower extremity CT scans acquired with and without ASIR. The study group consisted of 53 patients imaged on a CT scanner equipped with ASIR software. The control group consisted of 37 patients whose CT images were acquired without ASIR. Image noise, Computed Tomography Dose Index (CTDI) and dose length product (DLP) were measured. Two pediatric radiologists rated the studies in subjective categories: image sharpness, noise, diagnostic acceptability, and artifacts. The CTDI (p value = 0.0184) and the DLP were lower in the ASIR studies compared with the non-ASIR studies. However, the subjective ratings for image sharpness and the other quality categories were lower for the ASIR images than for the non-ASIR CT studies. Adaptive statistical iterative reconstruction reduces radiation dose for lower extremity CTs in children, but at the expense of diagnostic imaging quality. Further studies are warranted to determine the specific utility of ASIR for pediatric musculoskeletal CT imaging.
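
    The dose comparison reported above amounts to a two-group test on CTDI and DLP. A minimal sketch of such a comparison, using Welch's t-test on hypothetical CTDIvol values for an ASIR group of 53 and a non-ASIR group of 37, is given below; the means and spreads are invented for illustration.

    ```python
    # Sketch of a two-sample (Welch) t-test comparing a dose index between an
    # ASIR group and a non-ASIR control group.  The values are hypothetical.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    ctdi_asir = rng.normal(loc=8.5, scale=2.0, size=53)       # mGy, study group
    ctdi_non_asir = rng.normal(loc=11.0, scale=2.5, size=37)  # mGy, control group

    t_stat, p_value = stats.ttest_ind(ctdi_asir, ctdi_non_asir, equal_var=False)
    print(f"mean CTDIvol: ASIR {ctdi_asir.mean():.1f} vs non-ASIR {ctdi_non_asir.mean():.1f} mGy")
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
    ```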

  14. 7 CFR 58.523 - Laboratory and quality control tests.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  15. 21 CFR 640.56 - Quality control test for potency.

    Science.gov (United States)

    2010-04-01

    ... quality control test for potency may be performed by a clinical laboratory which meets the standards of... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Quality control test for potency. 640.56 Section...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Cryoprecipitate § 640.56 Quality control...

  16. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation.

    Science.gov (United States)

    Dyers, Robin; Ward, Grant; Du Plooy, Shane; Fourie, Stephanus; Evans, Juliet; Mahomed, Hassan

    2017-05-24

    The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality. To compare ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on impact on ICD coding accuracy were not significant. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.
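
    A controlled before-and-after design of this kind is commonly analysed as a difference-in-differences model. The sketch below fits a logistic regression with a site-by-period interaction to simulated record-level data; the variable names, effect sizes and sample size are assumptions, not the study's data.

    ```python
    # Sketch of a controlled before-and-after (difference-in-differences) analysis
    # of a binary outcome (complete ICD code set: yes/no) using logistic regression
    # with a site x period interaction.  The record-level data are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 2000
    df = pd.DataFrame({
        "intervention_site": rng.integers(0, 2, n),   # 1 = training site, 0 = control
        "post_period":       rng.integers(0, 2, n),   # 1 = after the intervention
    })
    logit_p = (-0.5 + 0.2 * df["intervention_site"] + 0.1 * df["post_period"]
               + 1.0 * df["intervention_site"] * df["post_period"])
    df["complete_codes"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = smf.logit("complete_codes ~ intervention_site * post_period", data=df).fit(disp=0)
    # the interaction term is the difference-in-differences effect on the log-odds scale
    print(np.exp(model.params["intervention_site:post_period"]))   # adjusted odds ratio
    ```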

  17. Web quality control for lectures: Supercourse and Amazon.com.

    Science.gov (United States)

    Linkov, Faina; LaPorte, Ronald; Lovalekar, Mita; Dodani, Sunita

    2005-12-01

    Peer review has been the cornerstone of quality control for biomedical journals for the past 300 years. With the emergence of the Internet, new models of quality control and peer review are emerging. However, such models are poorly investigated. We would argue that the popular system of quality control used by Amazon.com offers a way to ensure continuous quality improvement in the area of research communications on the Internet. Such a system provides an interesting alternative to the traditional peer review approaches used in biomedical journals and challenges the traditional paradigms of scientific publishing. This idea is being explored in the context of Supercourse, a library of 2,350 prevention lectures, shared for free by faculty members from over 150 countries. Supercourse is successfully utilizing quality control approaches that are similar to the Amazon.com model. Clearly, the existing approaches and emerging alternatives for quality control in scientific communications need to be assessed scientifically. The rapid explosion of Internet technologies could be leveraged to produce better, more cost-effective systems for quality control in biomedical publications and across all sciences.

  18. 40 CFR 81.111 - Georgetown Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.111 Georgetown Intrastate Air Quality Control Region. The Georgetown Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Georgetown Intrastate Air Quality...

  19. 40 CFR 81.107 - Greenwood Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.107 Greenwood Intrastate Air Quality Control Region. The Greenwood Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Greenwood Intrastate Air Quality...

  20. 40 CFR 81.108 - Columbia Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.108 Columbia Intrastate Air Quality Control Region. The Columbia Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Columbia Intrastate Air Quality...

  1. 40 CFR 81.109 - Florence Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.109 Florence Intrastate Air Quality Control Region. The Florence Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Florence Intrastate Air Quality...

  2. 40 CFR 81.35 - Louisville Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.35 Louisville Interstate Air Quality Control Region. The Louisville Interstate Air Quality Control Region (Kentucky-Indiana) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Louisville Interstate Air Quality...

  3. Application of classical versus bayesian statistical control charts to on-line radiological monitoring

    International Nuclear Information System (INIS)

    DeVol, T.A.; Gohres, A.A.; Williams, C.L.

    2009-01-01

    False positive and false negative incidence rates of radiological monitoring data from classical and Bayesian statistical process control chart techniques are compared. On-line monitoring for illicit radioactive material with no false positives or false negatives is the goal of homeland security monitoring, but is unrealistic. However, statistical fluctuations in the detector signal, short detection times, large source-to-detector distances, and shielding effects make distinguishing between a radiation source and natural background particularly difficult. Experimental time series data were collected using a 1' x 1' LaCl3(Ce)-based scintillation detector (Scionix, Orlando, FL) under various simulated conditions. Experimental parameters include radionuclide (gamma-ray) energy, activity, density thickness (source-to-detector distance and shielding), time, and temperature. All statistical algorithms were developed using MATLAB. The Shewhart (3-σ) control chart and the cumulative sum (CUSUM) control chart are the classical procedures adopted, while the Bayesian technique is the Shiryayev-Roberts (S-R) control chart. The Shiryayev-Roberts method was the best method for controlling the number of false positive detections, followed by the CUSUM method. However, the Shiryayev-Roberts method, used without modification, resulted in one of the highest false negative incidence rates independent of the signal strength. Modification of the Shiryayev-Roberts statistical analysis method reduced the number of false negatives, but resulted in an increase in the false positive incidence rate. (author)
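
    A minimal sketch of two of the compared procedures, the one-sided CUSUM chart and the Shiryayev-Roberts statistic, applied to a Gaussian count-rate series with a known target shift, is shown below. The background level, shift size and alarm thresholds are illustrative choices, not the parameters used in the paper.

    ```python
    # Sketch of one-sided CUSUM and Shiryayev-Roberts statistics for detecting an
    # upward mean shift in a Gaussian background count-rate series.  The background
    # level, target shift and thresholds are illustrative only.
    import numpy as np

    rng = np.random.default_rng(4)
    mu0, mu1, sigma = 100.0, 110.0, 10.0              # in-control mean, shifted mean, std
    x = np.concatenate([rng.normal(mu0, sigma, 60),   # background counts
                        rng.normal(mu1, sigma, 20)])  # source present

    k = (mu1 - mu0) / 2                               # CUSUM reference value
    h = 5 * sigma                                     # CUSUM decision interval
    A = 1000.0                                        # Shiryayev-Roberts threshold
    cusum_S, sr_R = 0.0, 0.0

    for t, xt in enumerate(x):
        cusum_S = max(0.0, cusum_S + (xt - mu0 - k))
        lr = np.exp((mu1 - mu0) * (xt - (mu0 + mu1) / 2) / sigma**2)  # likelihood ratio
        sr_R = (1.0 + sr_R) * lr
        if cusum_S > h:
            print(f"CUSUM alarm at sample {t}")
            break
        if sr_R > A:
            print(f"Shiryayev-Roberts alarm at sample {t}")
            break
    ```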

  4. [Pharmaceutical product quality control and good manufacturing practices].

    Science.gov (United States)

    Hiyama, Yukio

    2010-01-01

    This report describes the roles of Good Manufacturing Practices (GMP) in pharmaceutical product quality control. There are three keys to pharmaceutical product quality control: specifications, thorough product characterization during development, and adherence to GMP, for which the ICH Q6A guideline on specifications provides the most important principles in its background section. The impacts on product quality control of the revised Pharmaceutical Affairs Law (rPAL), which became effective in 2005, are discussed. Progress of the ICH discussions on Pharmaceutical Development (Q8), Quality Risk Management (Q9) and Pharmaceutical Quality System (Q10) is reviewed. In order to reconstruct the GMP guidelines and the GMP inspection system in the regulatory agencies under the new paradigm set by rPAL and the ICH, a series of Health Science studies was conducted. For the GMP guidelines, a product GMP guideline, a technology transfer guideline, a laboratory control guideline and a change control system guideline were written. For the GMP inspection system, an inspection checklist, an inspection memo and an inspection scenario were also proposed by the Health Science study groups. Because pharmaceutical products and their raw materials are manufactured and distributed internationally, collaboration with other national authorities is highly desirable. In order to enhance international collaboration, consistent establishment of a GMP inspection quality system throughout Japan will be essential.

  5. Family Control and Earnings Quality

    Directory of Open Access Journals (Sweden)

    Carolina Bona Sánchez

    2007-06-01

    Full Text Available This work examines the relationship between family control and earnings quality in a context where the salient agency problem shifts away from the classical divergence between managers and shareholders to conflicts between the controlling owner and minority shareholders. The results reveal that, compared to non-family firms, family firms show higher earnings quality in terms of both lower discretionary accruals and greater ability of current earnings components to predict future cash flows. They also show that an increase in the voting rights held by the controlling family raises earnings quality. The evidence is consistent with the presence of a reputation/long-term involvement effect associated with the family firm. Moreover, the work shows that, as the divergence between the voting and cash flow rights held by the controlling family decreases, earnings quality increases. KEYWORDS: voting rights, divergence, family firm, earnings quality, reputation, private benefits.

  6. 40 CFR 81.42 - Chattanooga Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.42 Chattanooga Interstate Air Quality Control Region. The Chattanooga Interstate Air Quality Control Region (Georgia-Tennessee) has been revised to consist of the territorial area... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Chattanooga Interstate Air Quality...

  7. Quality control activities in the environmental radiology laboratory

    International Nuclear Information System (INIS)

    Llaurado, M.; Quesada, D.; Rauret, G.; Tent, J.; Zapata, D.

    2006-01-01

    During the last twenty years many analytical laboratories have implemented quality assurance systems. A quality system implementation requires documentation of all activities (technical and management), evaluation of these activities and their continual improvement. Implementation and adequate management of all the elements a quality system includes are not enough to guarantee the quality of the analytical results generated at any given time. That is the aim of a group of specific activities labelled quality control activities. The Laboratori de Radiologia Ambiental (Environmental Radiology Laboratory; LRA) at the University of Barcelona was created in 1984 to carry out part of the quality control assays of the Environmental Radiology Monitoring Programs around some of the Spanish nuclear power plants, which are developed by the Servei Catala d'Activitats Energetiques (SCAR) and the Consejo de Seguridad Nuclear (CSN), organisations responsible for nuclear security and radiological protection. In this kind of laboratory, given the importance of the results they produce, quality control activities become an essential aspect. In order to guarantee the quality of its analytical results, the LRA Direction decided to adopt the international standard UNE-EN ISO/IEC 17025 for its internal quality system and to accredit some of the assays it carries out. This standard establishes that the laboratory shall monitor the validity of the tests undertaken and that data shall be recorded in such a way that trends are detectable. The present work describes the activities carried out in this respect by the LRA: equipment control activities, which in the particular case of radiochemical techniques include measurement of backgrounds and blanks as well as periodic control of efficiency and resolution; activities to assure the specifications established by method validation, namely testing of reference materials and periodic analysis of control samples; and evaluation of the laboratory work quality

  8. Quality control and analysis of radiotracer compounds

    International Nuclear Information System (INIS)

    Sheppard, G.; Thomson, R.

    1977-01-01

    Special emphasis was placed on the possible problems and errors in quality control and analysis. The principles underlying quality control were outlined, and analytical techniques applicable to radiotracers were described. The chapter concluded with a selection of examples showing the effects of impurities on the use of radiotracers. The subject of quality control and analysis was treated from the viewpoint of the user and of those research workers who need to synthesize and analyze their own radiochemicals. The quality characteristics for radiotracers are of two kinds, variable or attributive; these were discussed in the chapter. For counting low radioactive concentrations, scintillation techniques are in general use, whereas ionization techniques are now used mainly for the measurement of high radioactive concentrations or large quantities of radioactivity, for scanning chromatograms, and for a number of very specific purposes. Determination of radionuclidic purity was discussed. The use of radiotracers in pharmaceuticals was presented. 4 figures, 6 tables

  9. MEASUREMENT OF QUALITY MANAGEMENT SYSTEM PERFORMANCE IN MEAT PROCESSING

    Directory of Open Access Journals (Sweden)

    Elena S. Voloshina

    2017-01-01

    Full Text Available Modern methods aimed at ensuring food quality require processing plants to implement and certify quality management systems. Measuring the effectiveness of an existing QMS is often a very difficult task for management, owing to the fragmentation of the measured metrics or even their absence; this underlines the relevance of the research conducted here. The article presents criteria for assessing the effectiveness of the production process of meat processing plants using scaling methods and Shewhart control charts. The authors developed and present the formulae for calculating the single indicators used in the subsequent comprehensive assessment. An algorithm for the statistical evaluation of process controllability is presented; it allows the statistical control of production processes to be assessed in an accessible form and statistical quality control to be organized when developing quality management systems. The proposed procedure is based on a process approach, the essence of which is the application of the Deming cycle "Plan - Do - Check - Act", which makes it easy to integrate into any existing quality management system.
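
    A compact sketch of the kind of controllability and capability assessment described above is given below: subgroup means are checked against Shewhart X-bar 3-sigma limits and a Cp/Cpk estimate is computed. The subgroup data and the specification limits are hypothetical.

    ```python
    # Sketch of a Shewhart X-bar controllability check plus a Cp/Cpk capability
    # estimate for one measured quality indicator.  Subgroup data and the
    # specification limits are hypothetical.
    import numpy as np

    rng = np.random.default_rng(5)
    subgroups = rng.normal(loc=50.0, scale=1.5, size=(20, 5))   # 20 subgroups of 5

    xbar = subgroups.mean(axis=1)
    s = subgroups.std(axis=1, ddof=1)
    sigma_hat = s.mean() / 0.94                  # c4 correction for subgroup size 5
    center = xbar.mean()
    ucl = center + 3 * sigma_hat / np.sqrt(5)
    lcl = center - 3 * sigma_hat / np.sqrt(5)
    in_control = np.all((xbar > lcl) & (xbar < ucl))
    print("process statistically controlled:", bool(in_control))

    lsl, usl = 45.0, 55.0                        # assumed specification limits
    cp = (usl - lsl) / (6 * sigma_hat)
    cpk = min(usl - center, center - lsl) / (3 * sigma_hat)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    ```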

  10. Quality control procedures in positron tomography

    International Nuclear Information System (INIS)

    Spinks, T.; Jones, T.; Heather, J.; Gilardi, M.

    1989-01-01

    The derivation of physiological parameters in positron tomography relies on accurate calibration of the tomograph. Normally, the calibration relates image pixel count density to the count rate from an external blood counter per unit activity concentration in each device. The quality control of the latter is simple and relies on detector stability, assessed by measurement of a standard source of similar geometry to a blood sample. The quality control of the tomographic data depends on (i) detector stability, (ii) uniformity of calibration and normalisation sources and (iii) reproducibility of the attenuation correction procedure. A quality control procedure has been developed for an 8-detector-ring (15 transaxial plane) tomograph in which detector response is assessed by acquiring data from retractable transmission ring sources. These are scanned daily, and a printout of detector efficiencies is produced, as well as of the changes from a given date. This provides the raw data from which decisions on recalibration or renormalisation are made. (orig.)
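
    The daily efficiency check described above can be sketched as a simple comparison of today's normalized detector efficiencies against a reference scan, flagging detectors whose change exceeds a tolerance. The detector count, efficiency values and the 5 % tolerance below are assumptions for illustration.

    ```python
    # Sketch of a daily detector-efficiency check: compare today's normalized
    # efficiencies from the transmission-source scan against a reference scan
    # and flag detectors that drifted beyond a tolerance.  Values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(6)
    n_detectors = 512
    reference = rng.normal(1.0, 0.02, n_detectors)     # reference efficiencies
    today = reference * rng.normal(1.0, 0.01, n_detectors)
    today[37] *= 0.90                                  # simulate one faulty detector

    percent_change = 100.0 * (today - reference) / reference
    tolerance = 5.0                                    # assumed % tolerance
    drifted = np.flatnonzero(np.abs(percent_change) > tolerance)
    print("detectors outside tolerance:", drifted.tolist())
    ```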

  11. 21 CFR 862.1660 - Quality control material (assayed and unassayed).

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Quality control material (assayed and unassayed... Test Systems § 862.1660 Quality control material (assayed and unassayed). (a) Identification. A quality... that may arise from reagent or analytical instrument variation. A quality control material (assayed and...

  12. Guideline implementation in clinical practice: Use of statistical process control charts as visual feedback devices

    Directory of Open Access Journals (Sweden)

    Fahad A Al-Hussein

    2009-01-01

    Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.

  13. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  14. Quality control in the radioactive waste management

    International Nuclear Information System (INIS)

    Rzyski, B.M.

    1989-01-01

    Radioactive waste management, like other industrial activities, must maintain a quality control programme at every step. This control, extending from the acquisition of materials for waste treatment to the disposal of packages, is one of the most important activities because it aims to meet the waste acceptance criteria of repositories and helps guarantee the safety of nuclear facilities. In this work, basic knowledge about quality control in waste management and some examples of procedures adopted in other countries are given. (author) [pt

  15. Assessment of air quality benefits from national air pollution control policies in China. Part II: Evaluation of air quality predictions and air quality benefits assessment

    Science.gov (United States)

    Wang, Litao; Jang, Carey; Zhang, Yang; Wang, Kai; Zhang, Qiang; Streets, David; Fu, Joshua; Lei, Yu; Schreifels, Jeremy; He, Kebin; Hao, Jiming; Lam, Yun-Fat; Lin, Jerry; Meskhidze, Nicholas; Voorhees, Scott; Evarts, Dale; Phillips, Sharon

    2010-09-01

    Following the meteorological evaluation in Part I, this Part II paper presents the statistical evaluation of air quality predictions by the U.S. Environmental Protection Agency (U.S. EPA)'s Community Multi-Scale Air Quality (Models-3/CMAQ) model for the four simulated months in the base year 2005. The surface predictions were evaluated using the Air Pollution Index (API) data published by the China Ministry of Environmental Protection (MEP) for 31 capital cities and daily fine particulate matter (PM2.5, particles with aerodynamic diameter less than or equal to 2.5 μm) observations from an individual site at Tsinghua University (THU). To overcome the shortage of surface observations, satellite data are used to assess the column predictions, including tropospheric nitrogen dioxide (NO2) column abundance and aerosol optical depth (AOD). The results show that CMAQ gives reasonably good predictions of air quality. The air quality improvement that would result from the targeted sulfur dioxide (SO2) and nitrogen oxides (NOx) emission controls in China was assessed for the objective year 2010. The results show that the emission controls can lead to significant air quality benefits. SO2 concentrations in highly polluted areas of East China in 2010 are estimated to decrease by 30-60% compared to the levels in the 2010 Business-As-Usual (BAU) case. The annual PM2.5 can also decline by 3-15 μg m-3 (4-25%) due to the lower SO2 and sulfate concentrations. If similar controls are implemented for NOx emissions, NOx concentrations are estimated to decrease by 30-60% as compared with the 2010 BAU scenario. The annual mean PM2.5 concentrations will also decline by 2-14 μg m-3 (3-12%). In addition, the number of ozone (O3) non-attainment areas in northern China is projected to be much lower, with the maximum 1-h average O3 concentrations in the summer reduced by 8-30 ppb.
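
    A statistical evaluation of this kind typically reports paired prediction-observation statistics such as mean bias, normalized mean bias, RMSE and correlation. The sketch below computes these for a hypothetical paired series; the simulated bias and the series length are illustrative only.

    ```python
    # Sketch of common model-evaluation statistics for paired predicted vs.
    # observed concentrations (e.g. daily PM2.5).  The paired series are hypothetical.
    import numpy as np

    rng = np.random.default_rng(7)
    obs = rng.gamma(shape=4.0, scale=15.0, size=120)          # observed, ug/m3
    pred = obs * rng.normal(1.1, 0.2, size=120)               # model with ~10 % high bias

    mb = np.mean(pred - obs)                                  # mean bias
    nmb = 100.0 * np.sum(pred - obs) / np.sum(obs)            # normalized mean bias, %
    rmse = np.sqrt(np.mean((pred - obs) ** 2))                # root-mean-square error
    r = np.corrcoef(pred, obs)[0, 1]                          # correlation coefficient

    print(f"MB = {mb:.1f} ug/m3, NMB = {nmb:.1f} %, RMSE = {rmse:.1f} ug/m3, r = {r:.2f}")
    ```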

  16. Statistical applications for chemistry, manufacturing and controls (CMC) in the pharmaceutical industry

    CERN Document Server

    Burdick, Richard K; Pfahler, Lori B; Quiroz, Jorge; Sidor, Leslie; Vukovinsky, Kimberly; Zhang, Lanju

    2017-01-01

    This book examines statistical techniques that are critically important to Chemistry, Manufacturing, and Control (CMC) activities. Statistical methods are presented with a focus on applications unique to the CMC in the pharmaceutical industry. The target audience consists of statisticians and other scientists who are responsible for performing statistical analyses within a CMC environment. Basic statistical concepts are addressed in Chapter 2 followed by applications to specific topics related to development and manufacturing. The mathematical level assumes an elementary understanding of statistical methods. The ability to use Excel or statistical packages such as Minitab, JMP, SAS, or R will provide more value to the reader. The motivation for this book came from an American Association of Pharmaceutical Scientists (AAPS) short course on statistical methods applied to CMC applications presented by four of the authors. One of the course participants asked us for a good reference book, and the only book recomm...

  17. Advanced spot quality analysis in two-colour microarray experiments

    Directory of Open Access Journals (Sweden)

    Vetter Guillaume

    2008-09-01

    Full Text Available Abstract Background Image analysis of microarrays and, in particular, spot quantification and spot quality control, is one of the most important steps in the statistical analysis of microarray data. Recent methods of spot quality control are still at an early stage of development, often leading to underestimation of true positive microarray features and, consequently, to loss of important biological information. Therefore, improving and standardizing the statistical approaches to spot quality control is essential to facilitate the overall analysis of microarray data and the subsequent extraction of biological information. Findings We evaluated the performance of two image analysis packages, MAIA and GenePix (GP), using two complementary experimental approaches with a focus on the statistical analysis of spot quality factors. First, we developed control microarrays with a priori known fluorescence ratios to verify the accuracy and precision of the ratio estimation of signal intensities. Next, we developed advanced semi-automatic protocols of spot quality evaluation in MAIA and GP and compared their performance with the available facilities for quantitative spot filtering in GP. We evaluated these algorithms for standardised spot quality analysis in a whole-genome microarray experiment assessing well-characterised transcriptional modifications induced by the transcription regulator SNAI1. Using a set of RT-PCR or qRT-PCR validated microarray data, we found that the semi-automatic protocol of spot quality control we developed with MAIA allowed recovery of approximately 13% more spots and 38% more differentially expressed genes (at FDR = 5%) than GP with default spot filtering conditions. Conclusion Careful control of spot quality characteristics with advanced spot quality evaluation can significantly increase the amount of confident and accurate data, resulting in more meaningful biological conclusions.

  18. [Post-marketing reevaluation for potential quality risk and quality control in clinical application of traditional Chinese medicines].

    Science.gov (United States)

    Li, Hong-jiao; He, Li-yun; Liu, Bao-yan

    2015-06-01

    Effective quality control in clinical practice is a guarantee of the authenticity and scientific validity of the findings. The post-marketing reevaluation of traditional Chinese medicines (TCM) focuses on the efficacy, adverse reactions, combined medication and effective dose of drugs on the market by means of expanded clinical trials, and requires a larger sample size and a wider range of patients. This, therefore, increases the difficulty of quality control in clinical practice. Drawing on experience with quality control in the clinical practices of the post-marketing reevaluation of Kangbingdu oral for cold, the researchers in this study reviewed the study purpose, project, scheme design and clinical practice process from an overall point of view, analyzed the study characteristics of the post-marketing reevaluation of TCMs and the quality control risks, designed the quality control contents around the quality-impacting factors, defined key review contents and summarized the precautions in clinical practice, with the aim of improving the efficiency of quality control in clinical practice. This study can provide a reference for clinical units and quality control personnel in the post-marketing reevaluation of TCMs.

  19. Adaptive statistical iterative reconstruction: reducing dose while preserving image quality in the pediatric head CT examination

    Energy Technology Data Exchange (ETDEWEB)

    McKnight, Colin D.; Watcharotone, Kuanwong; Ibrahim, Mohannad; Christodoulou, Emmanuel; Baer, Aaron H.; Parmar, Hemant A. [University of Michigan, Department of Radiology, Ann Arbor, MI (United States)

    2014-08-15

    Over the last decade there has been escalating concern regarding the increasing radiation exposure stemming from CT exams, particularly in children. Adaptive statistical iterative reconstruction (ASIR) is a relatively new and promising tool to reduce radiation dose while preserving image quality. While encouraging results have been found in adult head, chest and body imaging, validation of this technique in the pediatric population is limited. The objective of our study was to retrospectively compare the image quality and radiation dose of pediatric head CT examinations obtained with ASIR to pediatric head CT examinations without ASIR in a large patient population. Retrospective analysis was performed on 82 pediatric head CT examinations. This group included 33 pediatric head CT examinations obtained with ASIR and 49 pediatric head CT examinations without ASIR. The computed tomography dose index (CTDIvol) was recorded for all examinations. Quantitative analysis consisted of standardized measurement of attenuation and its standard deviation at the bilateral centrum semiovale and cerebellar white matter to evaluate objective noise. Qualitative analysis consisted of independent assessment by two radiologists, in a blinded manner, of gray-white differentiation, sharpness and overall diagnostic quality. The average CTDIvol value of the ASIR group was 21.8 mGy (SD = 4.0) while the average CTDIvol for the non-ASIR group was 29.7 mGy (SD = 13.8), reflecting a statistically significant reduction in CTDIvol in the ASIR group (P < 0.01). There were statistically significant reductions in CTDI for the 3- to 12-year-old ASIR group as compared to the 3- to 12-year-old non-ASIR group (21.5 mGy vs. 30.0 mGy; P = 0.004) as well as statistically significant reductions in CTDI for the >12-year-old ASIR group as compared to the >12-year-old non-ASIR group (29.7 mGy vs. 49.9 mGy; P = 0.0002). Quantitative analysis revealed no significant difference in the

  20. The results of a quality-control programme in mammography

    International Nuclear Information System (INIS)

    Ramsdale, M.L.; Hiles, P.A.

    1989-01-01

    A quality-control programme at a breast screening clinic is described. Daily checks include film sensitometry for X-ray processor control and radiography of a lucite phantom to monitor the consistency of the X-ray machine's automatic exposure control. Weekly checks include additional measurements of the performance of the automatic exposure control for different breast thicknesses and an overall assessment of image quality using a prototype mammography test phantom. The test phantom measures low-contrast sensitivity, high-contrast resolution and small-detail visibility. The results of the quality-control programme are presented with particular attention paid to tolerances and limiting values. (author)

  1. Automatic analysis of image quality control for Image Guided Radiation Therapy (IGRT) devices in external radiotherapy

    International Nuclear Information System (INIS)

    Torfeh, Tarraf

    2009-01-01

    On-board imagers mounted on a radiotherapy treatment machine are very effective devices that improve the geometric accuracy of radiation delivery. However, a precise and regular quality control program is required in order to achieve this objective. Our purpose was to develop software tools dedicated to automatic image quality control of the IGRT devices used in external radiotherapy: the 2D-MV mode (high-energy images) used for measuring patient position during treatment, the 2D-kV mode (low-energy images) and the 3D Cone Beam Computed Tomography (CBCT) MV or kV mode, used for patient positioning before treatment. Automated analysis of the Winston-Lutz test was also proposed. This test is used for the evaluation of the mechanical aspects of treatment machines, on which additional constraints are placed by the extra weight of the on-board imagers. Finally, a technique for generating digital phantoms in order to assess the performance of the proposed software tools is described. Software tools dedicated to automatic quality control of IGRT devices reduce the time spent by the medical physics team analyzing the results of the controls by a factor of 100, while improving accuracy through objective and reproducible analysis and offering traceability through automatically generated monitoring reports and statistical studies. (author) [fr

  2. Supervised pelvic floor muscle training versus attention-control massage treatment in patients with faecal incontinence: Statistical analysis plan for a randomised controlled trial.

    Science.gov (United States)

    Ussing, Anja; Dahn, Inge; Due, Ulla; Sørensen, Michael; Petersen, Janne; Bandholm, Thomas

    2017-12-01

    Faecal incontinence affects approximately 8-9% of the adult population. The condition is surrounded by taboo; it can have a devastating impact on quality of life and lead to major limitations in daily life. Pelvic floor muscle training in combination with information and fibre supplements is recommended as first-line treatment for faecal incontinence. Despite this, the effect of pelvic floor muscle training for faecal incontinence is unclear. No previous trials have investigated the efficacy of supervised pelvic floor muscle training in combination with conservative treatment and compared this to an attention-control massage treatment including conservative treatment. The aim of this trial is to investigate if 16 weeks of supervised pelvic floor muscle training in combination with conservative treatment is superior to attention-control massage treatment and conservative treatment in patients with faecal incontinence. Randomised, controlled, superiority trial with two parallel arms. 100 participants with faecal incontinence will be randomised to either (1) individually supervised pelvic floor muscle training and conservative treatment or (2) attention-control massage treatment and conservative treatment. The primary outcome is participants' rating of symptom changes after 16 weeks of treatment using the Patient Global Impression of Improvement Scale. Secondary outcomes are the Vaizey Incontinence Score, the Fecal Incontinence Severity Index, the Fecal Incontinence Quality of Life Scale, a 14-day bowel diary, anorectal manometry and rectal capacity measurements. Follow-up assessment at 36 months will be conducted. This paper describes and discusses the rationale, the methods and in particular the statistical analysis plan of this trial.

  3. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Science.gov (United States)

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  4. 40 CFR 81.104 - Central Pennsylvania Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.104 Section 81.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.104 Central Pennsylvania Intrastate Air Quality Control Region. The Central Pennsylvania Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  5. 40 CFR 81.43 - Metropolitan Toledo Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.43 Section 81.43 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.43 Metropolitan Toledo Interstate Air Quality Control Region. The Metropolitan Toledo Interstate Air Quality Control Region (Ohio-Michigan) consists of the territorial area...

  6. 40 CFR 81.31 - Metropolitan Providence Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.31 Section 81.31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.31 Metropolitan Providence Interstate Air Quality Control Region. The Metropolitan Providence Interstate Air Quality Control Region (Rhode Island-Massachusetts) consists of the...

  7. 40 CFR 81.90 - Androscoggin Valley Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.90 Section 81.90 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.90 Androscoggin Valley Interstate Air Quality Control Region. The Androscoggin Valley Interstate Air Quality Control Region (Maine-New Hampshire) consists of the territorial...

  8. 40 CFR 81.78 - Metropolitan Portland Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.78 Section 81.78 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.78 Metropolitan Portland Intrastate Air Quality Control Region. The Metropolitan Portland Intrastate Air Quality Control Region (Maine) consists of the territorial area...

  9. 40 CFR 81.30 - Southeastern Wisconsin Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.30 Section 81.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.30 Southeastern Wisconsin Intrastate Air Quality Control Region. The Metropolitan Milwaukee Intrastate Air Quality Control Region (Wisconsin) has been renamed the Southeastern...

  10. 40 CFR 81.16 - Metropolitan Denver Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.16 Section 81.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.16 Metropolitan Denver Intrastate Air Quality Control Region. The Metropolitan Denver Intrastate Air Quality Control Region (Colorado) consists of the territorial area...

  11. 40 CFR 81.47 - Central Oklahoma Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.47 Section 81.47 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.47 Central Oklahoma Intrastate Air Quality Control Region. The Metropolitan Oklahoma Intrastate Air Quality Control Region has been renamed the Central Oklahoma Intrastate...

  12. 40 CFR 81.29 - Metropolitan Indianapolis Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Air Quality Control Region. 81.29 Section 81.29 Protection of Environment ENVIRONMENTAL PROTECTION... Designation of Air Quality Control Regions § 81.29 Metropolitan Indianapolis Intrastate Air Quality Control Region. The Metropolitan Indianapolis Intrastate Air Quality Control Region consists of the territorial...

  13. 40 CFR 81.101 - Metropolitan Dubuque Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.101 Section 81.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.101 Metropolitan Dubuque Interstate Air Quality Control Region. The Metropolitan Dubuque Interstate Air Quality Control Region (Illinois-Iowa-Wisconsin) consists of the...

  14. 40 CFR 81.79 - Northeastern Oklahoma Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.79 Section 81.79 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.79 Northeastern Oklahoma Intrastate Air Quality Control Region. The Metropolitan Tulsa Intrastate Air Quality Control Region has been renamed the Northeastern Oklahoma Intrastate...

  15. 40 CFR 81.24 - Niagara Frontier Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.24 Section 81.24 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.24 Niagara Frontier Intrastate Air Quality Control Region. The Niagara Frontier Intrastate Air Quality Control Region (New York) consists of the territorial area...

  16. 40 CFR 81.106 - Greenville-Spartanburg Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.106 Section 81.106 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.106 Greenville-Spartanburg Intrastate Air Quality Control Region. The Greenville-Spartanburg Intrastate Air Quality Control Region (South Carolina) consists of the territorial...

  17. 40 CFR 81.44 - Metropolitan Memphis Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.44 Section 81.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.44 Metropolitan Memphis Interstate Air Quality Control Region. The Metropolitan Memphis Interstate Air Quality Control Region (Arkansas-Mississippi-Tennessee) consists of the...

  18. 40 CFR 81.19 - Metropolitan Boston Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.19 Section 81.19 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.19 Metropolitan Boston Intrastate Air Quality Control Region. The Metropolitan Boston Intrastate Air Quality Control Region (Massachusetts) consists of the territorial area...

  19. 40 CFR 81.28 - Metropolitan Baltimore Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.28 Section 81.28 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.28 Metropolitan Baltimore Intrastate Air Quality Control Region. The Metropolitan Baltimore Intrastate Air Quality Control Region (Maryland) consists of the territorial area...

  20. 40 CFR 81.119 - Western Tennessee Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.119 Section 81.119 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.119 Western Tennessee Intrastate Air Quality Control Region. The Western Tennessee Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  1. 40 CFR 81.89 - Metropolitan Cheyenne Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.89 Section 81.89 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.89 Metropolitan Cheyenne Intrastate Air Quality Control Region. The Metropolitan Cheyenne Intrastate Air Quality Control Region (Wyoming) consists of the territorial area...

  2. 40 CFR 81.87 - Metropolitan Boise Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.87 Section 81.87 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.87 Metropolitan Boise Intrastate Air Quality Control Region. The Metropolitan Boise Intrastate Air Quality Control Region (Idaho) consists of the territorial area encompassed...

  3. 40 CFR 81.23 - Southwest Pennsylvania Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.23 Section 81.23 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.23 Southwest Pennsylvania Intrastate Air Quality Control Region. The Southwest Pennsylvania Intrastate Air Quality Control Region is redesignated to consist of the territorial...

  4. 40 CFR 81.75 - Metropolitan Charlotte Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.75 Section 81.75 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.75 Metropolitan Charlotte Interstate Air Quality Control Region. The Metropolitan Charlotte Interstate Air Quality Control Region (North Carolina-South Carolina) has been revised...

  5. 40 CFR 81.120 - Middle Tennessee Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.120 Section 81.120 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.120 Middle Tennessee Intrastate Air Quality Control Region. The Middle Tennessee Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  6. Quality controls in integrative approaches to detect errors and inconsistencies in biological databases

    Directory of Open Access Journals (Sweden)

    Ghisalberti Giorgio

    2010-12-01

    Full Text Available Numerous biomolecular data are available, but they are scattered in many databases and only some of them are curated by experts. Most available data are computationally derived and include errors and inconsistencies. Effective use of available data in order to derive new knowledge hence requires data integration and quality improvement. Many approaches for data integration have been proposed. Data warehousing seems to be the most adequate when comprehensive analysis of integrated data is required. This also makes it the most suitable approach for implementing comprehensive quality controls on integrated data. We previously developed GFINDer (http://www.bioinformatics.polimi.it/GFINDer/), a web system that supports scientists in effectively using available information. It allows comprehensive statistical analysis and mining of functional and phenotypic annotations of gene lists, such as those identified by high-throughput biomolecular experiments. The GFINDer backend is composed of a multi-organism genomic and proteomic data warehouse (GPDW). Within the GPDW, several controlled terminologies and ontologies, which describe gene and gene product related biomolecular processes, functions and phenotypes, are imported and integrated, together with their associations with genes and proteins of several organisms. In order to ease keeping the GPDW up to date and to ensure the best possible quality of the data integrated in subsequent updates of the data warehouse, we developed several automatic procedures. Within them, we implemented numerous data quality control techniques to test the integrated data for a variety of possible errors and inconsistencies. Among other features, the implemented controls check data structure and completeness, ontological data consistency, ID format and evolution, unexpected data quantification values, and consistency of data from single and multiple sources. We use the implemented controls to analyze the quality of data available from several
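
    The kinds of automatic checks listed above (completeness, identifier format, cross-source consistency) can be sketched on a small tabular extract as follows; the table layout, column names and rules are hypothetical, not the actual GPDW schema.

    ```python
    # Sketch of automated data-quality checks of the kind described (completeness,
    # identifier format, cross-source consistency) on an integrated annotation
    # table.  The table layout and column names are hypothetical.
    import pandas as pd

    annotations = pd.DataFrame({
        "gene_id":   ["G001", "G002", None, "G004", "G004"],
        "go_term":   ["GO:0008150", "GO:000815", "GO:0003674", "GO:0005575", "GO:0005575"],
        "source_db": ["A", "A", "B", "B", "B"],
    })

    issues = []
    # completeness: no missing identifiers
    if annotations["gene_id"].isna().any():
        issues.append("missing gene identifiers")
    # ID format: GO terms must match 'GO:' followed by exactly seven digits
    if not annotations["go_term"].str.fullmatch(r"GO:\d{7}").all():
        issues.append("malformed GO identifiers")
    # consistency: the same gene/term pair should not be reported twice by one source
    if annotations.duplicated(subset=["gene_id", "go_term", "source_db"]).any():
        issues.append("duplicate source annotations")

    print("quality issues:", issues or "none")
    ```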

  7. Quality in the fabrication process

    International Nuclear Information System (INIS)

    Romano, A.; Aguirre, F.

    2010-01-01

    Enusa's commitment to quality in the manufacturing process materializes in the application of the most advanced product quality control technologies, such as non-destructive inspection techniques (artificial vision, X-ray or UT inspection) and statistical control systems for process parameters. Quality inspectors are trained and certified by the main National Quality Organizations and receive periodic training under a formal company training program that constantly updates their qualification. The reliability of fabrication quality control is based on a strategy that prioritizes redundancy of critical inspection equipment and versatility of inspection personnel knowledge. Furthermore, improvement in fabrication quality is obtained by systematic application of the Six Sigma methodology, in which added value is created in projects that integrate crosscutting company knowledge, reinforcing the global company vision that the fuel business is based on quality. (Author)
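    As a rough illustration of the process-parameter statistical control this record mentions, the sketch below computes 3-sigma control limits and a Cpk capability index for a hypothetical measured dimension. The data, specification limits, and the use of the sample standard deviation are assumptions for illustration; this is not Enusa's system.

```python
import statistics

def control_limits(samples: list[float]) -> tuple[float, float, float]:
    """Return (LCL, center line, UCL) using 3-sigma limits from the sample std. dev."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mean - 3 * sigma, mean, mean + 3 * sigma

def process_capability(samples: list[float], lsl: float, usl: float) -> float:
    """Cpk: how well the process distribution fits inside the specification limits."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min(usl - mean, mean - lsl) / (3 * sigma)

if __name__ == "__main__":
    # Invented pellet-diameter measurements (mm) and specification limits.
    pellet_diameter_mm = [8.19, 8.21, 8.20, 8.18, 8.22, 8.20, 8.19, 8.21]
    lcl, cl, ucl = control_limits(pellet_diameter_mm)
    print(f"control limits: {lcl:.3f} / {cl:.3f} / {ucl:.3f}")
    print(f"Cpk vs spec 8.10-8.30 mm: {process_capability(pellet_diameter_mm, 8.10, 8.30):.2f}")
```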

  8. HPLC for quality control of polyimides

    Science.gov (United States)

    Young, P. R.; Sykes, G. F.

    1979-01-01

    The use of High Pressure Liquid Chromatography (HPLC) as a quality control tool for polyimide resins and prepregs is presented. A database to help establish accept/reject criteria for these materials was developed. This work is intended to supplement, not replace, the standard quality control tests normally conducted on incoming resins and prepregs. To help achieve these objectives, the HPLC separation of LARC-160 polyimide precursor resin was characterized. Room-temperature resin aging effects were studied. Graphite-reinforced composites made from fresh and aged resin were fabricated and tested to determine whether the changes observed by HPLC were significant.
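    One way such accept/reject criteria could be encoded, once a reference database of chromatograms exists, is sketched below. The peak names, retention times, and tolerances are hypothetical and are not taken from the LARC-160 results in this record.

```python
# Hedged sketch of an HPLC-based accept/reject screen against a reference fingerprint.
REFERENCE_PEAKS = {          # peak -> (retention time in min, expected relative area in %)
    "peak_A": (3.2, 41.0),
    "peak_B": (5.7, 33.0),
    "peak_C": (9.1, 26.0),
}
RT_TOL_MIN = 0.2             # allowed retention-time shift (assumed)
AREA_TOL_PCT = 5.0           # allowed change in relative peak area (assumed)

def accept_resin(measured: dict[str, tuple[float, float]]) -> bool:
    """Accept a resin lot only if every reference peak is present and within tolerance."""
    for name, (ref_rt, ref_area) in REFERENCE_PEAKS.items():
        if name not in measured:
            return False
        rt, area = measured[name]
        if abs(rt - ref_rt) > RT_TOL_MIN or abs(area - ref_area) > AREA_TOL_PCT:
            return False
    return True

if __name__ == "__main__":
    aged_lot = {"peak_A": (3.3, 35.5), "peak_B": (5.8, 36.0), "peak_C": (9.0, 28.5)}
    print("accept" if accept_resin(aged_lot) else "reject")
```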

  9. Enhancing Network Quality using Baseband Frequency Hopping, Downlink Power Control and DTX in a Live GSM Network

    DEFF Research Database (Denmark)

    Nielsen, Thomas Toftegaard; Wagard, Jeroen; Skjærris, Søren

    1998-01-01

    Baseband frequency hopping in combination with downlink power control and discontinuous transmission has been investigated as a quality-improving feature in a live GSM network. Using the dropped call rate and the frame erasure rate to measure the network quality, the use of frequency hopping... to the statistical inaccuracy with discontinuous transmission in GSM and possibly due to poor performance of the mobile stations, was encountered. The current status is therefore to reject the use of downlink discontinuous transmission until more information about the performance of the mobile stations is found...
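    For reference, the two quality measures named in this record reduce to simple ratios of event counters. The sketch below shows that computation with made-up counter values; the variable names and figures are not taken from the study.

```python
# Minimal sketch of the two quality measures; counter values are assumptions.
def dropped_call_rate(dropped_calls: int, total_calls: int) -> float:
    """Fraction of established calls that were dropped."""
    return dropped_calls / total_calls if total_calls else 0.0

def frame_erasure_rate(erased_frames: int, total_frames: int) -> float:
    """Fraction of received speech frames discarded by the decoder."""
    return erased_frames / total_frames if total_frames else 0.0

if __name__ == "__main__":
    print(f"DCR: {dropped_call_rate(42, 2500):.2%}")
    print(f"FER: {frame_erasure_rate(1800, 120000):.2%}")
```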

  10. 40 CFR 81.117 - Southeast Missouri Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.117 Section 81.117 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.117 Southeast Missouri Intrastate Air Quality Control Region. The Southeast Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  11. 40 CFR 81.45 - Metropolitan Atlanta Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.45 Section 81.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.45 Metropolitan Atlanta Intrastate Air Quality Control Region. The Metropolitan Atlanta Intrastate Air Quality Control Region (Georgia) has been revised to consist of the...

  12. 40 CFR 81.123 - Southeastern Oklahoma Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.123 Section 81.123 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.123 Southeastern Oklahoma Intrastate Air Quality Control Region. The Southeastern Oklahoma Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  13. 40 CFR 81.98 - Burlington-Keokuk Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.98 Section 81.98 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.98 Burlington-Keokuk Interstate Air Quality Control Region. The Burlington-Keokuk Interstate Air Quality Control Region (Illinois-Iowa) is revised to consist of the...

  14. 40 CFR 81.49 - Southeast Florida Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.49 Section 81.49 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.49 Southeast Florida Intrastate Air Quality Control Region. The Southeast Florida Intrastate Air Quality Control Region is redesignated to consist of the territorial area...

  15. 40 CFR 81.59 - Cumberland-Keyser Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.59 Section 81.59 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.59 Cumberland-Keyser Interstate Air Quality Control Region. The Cumberland-Keyser Interstate Air Quality Control Region (Maryland-West Virginia) has been revised to consist...

  16. 40 CFR 81.20 - Metropolitan Cincinnati Interstate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.20 Section 81.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.20 Metropolitan Cincinnati Interstate Air Quality Control Region. The Metropolitan Cincinnati Interstate Air Quality Control Region (Ohio-Kentucky-Indiana) is revised to consist of...

  17. 40 CFR 81.97 - Southwest Florida Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.97 Section 81.97 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.97 Southwest Florida Intrastate Air Quality Control Region. The Southwest Florida Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  18. 40 CFR 81.116 - Northern Missouri Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.116 Section 81.116 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.116 Northern Missouri Intrastate Air Quality Control Region. The Northern Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  19. 40 CFR 81.67 - Lake Michigan Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Regions § 81.67 Lake Michigan Intrastate Air Quality Control Region. The Menominee-Escanaba (Michigan)-Marinette (Wisconsin) Interstate Air Quality Control Region has been renamed the Lake Michigan Intrastate Air Quality Control Region (Wisconsin) and revised to consist of the territorial area...

  20. 40 CFR 81.34 - Metropolitan Dayton Intrastate Air Quality Control Region.

    Science.gov (United States)

    2010-07-01

    ... Quality Control Region. 81.34 Section 81.34 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.34 Metropolitan Dayton Intrastate Air Quality Control Region. The Metropolitan Dayton Intrastate Air Quality Control Region consists of the territorial area encompassed by the...