WorldWideScience

Sample records for statistically controlling additional

  1. Multivariate Statistical Process Control Charts: An Overview

    OpenAIRE

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...
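
    As an aside, the multivariate Shewhart-type chart reviewed above typically reduces to charting Hotelling's T². The sketch below is a minimal illustration of that idea, not code from the paper; it assumes NumPy and SciPy are available and uses a chi-square approximation for the upper control limit in place of the exact F-based Phase II limit.

        import numpy as np
        from scipy.stats import chi2

        def hotelling_t2(reference, new_obs, alpha=0.0027):
            """Phase II Hotelling T^2 chart; chi-square approximation for the UCL."""
            mu = reference.mean(axis=0)                  # in-control mean vector
            s_inv = np.linalg.inv(np.cov(reference, rowvar=False))
            d = new_obs - mu
            t2 = np.einsum('ij,jk,ik->i', d, s_inv, d)   # quadratic form per point
            ucl = chi2.ppf(1 - alpha, df=reference.shape[1])
            return t2, ucl, t2 > ucl                     # statistic, limit, signals

        rng = np.random.default_rng(0)
        ref = rng.multivariate_normal([0, 0], [[1, .6], [.6, 1]], size=200)
        new = rng.multivariate_normal([0.5, 2.0], [[1, .6], [.6, 1]], size=10)
        t2, ucl, out = hotelling_t2(ref, new)
        print(f"UCL = {ucl:.2f}, signals at {np.flatnonzero(out)}")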

  2. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  3. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  4. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Background: A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM), which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results: This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions: The GAM…

  5. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    Science.gov (United States)

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

    A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression…
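
    The permutation machinery described above can be illustrated with a toy sketch (not the authors' code: the papers permute case/control labels around a GAM deviance statistic, whereas this example uses a simple case-spread statistic; NumPy assumed):

        import numpy as np

        def location_permutation_test(coords, is_case, n_perm=999, seed=1):
            """Permutation p-value for association of point location with case status."""
            rng = np.random.default_rng(seed)
            def spread(labels):
                pts = coords[labels]                     # locations of the 'cases'
                return np.linalg.norm(pts - pts.mean(axis=0), axis=1).mean()
            observed = spread(is_case)
            perms = [spread(rng.permutation(is_case)) for _ in range(n_perm)]
            # one-sided: tightly clustered cases give a small spread
            return (1 + sum(p <= observed for p in perms)) / (n_perm + 1)

        # toy data: 300 points, cases more likely near the origin
        rng0 = np.random.default_rng(0)
        coords = rng0.uniform(-1, 1, size=(300, 2))
        is_case = np.linalg.norm(coords, axis=1) + rng0.normal(0, .3, 300) < .5
        print(location_permutation_test(coords, is_case))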

  6. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology; On-line Control, including Sampling Inspection and Statistical Process Control; Off-line Control, with Data Analysis and Experimental Design; and fields related to Reliability. Experts with international reputations present their newest contributions.

  7. Statistical Control Charts: Performances of Short Term Stock Trading in Croatia

    Directory of Open Access Journals (Sweden)

    Dumičić Ksenija

    2015-03-01

    Background: The stock exchange, as a regulated financial market, reflects the economic development level of a modern economy. The stock market indicates the mood of investors in the development of a country and is an important ingredient for growth. Objectives: This paper aims to introduce an additional statistical tool to support the decision-making process in stock trading, and it investigates the use of statistical process control (SPC) methods in the stock trading process. Methods/Approach: The individual (I), exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) control charts were used for gaining trade signals. The open and the average prices of CROBEX10 index stocks on the Zagreb Stock Exchange were used in the analysis. The capabilities of statistical control charts for short-term stock trading were analysed. Results: The statistical control chart analysis produced too many signals to buy or sell stocks, most of which were false alarms, so statistical control charts proved to be of limited use in stock trading and portfolio analysis. Conclusions: The presence of non-normality and autocorrelation has a great impact on the performance of statistical control charts. It is assumed that if these two problems were solved, the use of statistical control charts in portfolio analysis could be greatly improved.
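
    For reference, the charting statistics named above are short to state. The sketch below is a generic textbook construction, not the authors' settings (lam, k and h are illustrative defaults; NumPy assumed): an EWMA chart with time-varying limits and a tabular CUSUM.

        import numpy as np

        def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
            """EWMA z_t = lam*x_t + (1-lam)*z_{t-1} with time-varying 3-sigma limits."""
            z, width = [], []
            z_prev = target
            for t, xt in enumerate(x, start=1):
                z_prev = lam * xt + (1 - lam) * z_prev
                z.append(z_prev)
                width.append(L * sigma * np.sqrt(lam / (2 - lam)
                                                 * (1 - (1 - lam) ** (2 * t))))
            z, width = np.array(z), np.array(width)
            return z, target - width, target + width

        def cusum_signals(x, target, sigma, k=0.5, h=5.0):
            """Tabular CUSUM: indices where the upper or lower sum crosses h."""
            cp = cn = 0.0
            hits = []
            for t, xt in enumerate(x):
                u = (xt - target) / sigma
                cp, cn = max(0.0, cp + u - k), max(0.0, cn - u - k)
                if cp > h or cn > h:
                    hits.append(t)
                    cp = cn = 0.0                        # restart after a signal
            return hits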

  8. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. The control algorithm used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. The confidence interval method is used as the basis for adaptation. A simple statistical model which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept.

  9. Frontiers in statistical quality control 11

    CERN Document Server

    Schmid, Wolfgang

    2015-01-01

    The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientifi...

  10. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it becomes increasingly common to confront high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, where the aim is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
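
    A minimal sketch of the latent-structure monitoring described above (generic PCA-based T², not code from the chapter; NumPy assumed; loadings estimated from the SVD of a centered in-control reference set):

        import numpy as np

        def pca_t2(reference, new_obs, k=2):
            """T^2 computed in the space of the first k principal components."""
            mu = reference.mean(axis=0)
            _, s, vt = np.linalg.svd(reference - mu, full_matrices=False)
            loadings = vt[:k].T                          # p x k projection matrix
            comp_var = s[:k] ** 2 / (len(reference) - 1) # score variances
            scores = (new_obs - mu) @ loadings           # project new data
            return np.sum(scores ** 2 / comp_var, axis=1)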

  11. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  12. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2001-01-01

    The book is a collection of papers presented at the 5th International Workshop on Intelligent Statistical Quality Control in Würzburg, Germany. Contributions deal with methodology and successful industrial applications. They can be grouped into four categories: Sampling Inspection, Statistical Process Control, Data Analysis and Process Capability Studies, and Experimental Design.

  13. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management, then explores how to temper quantitative analysis with qualitati…

  14. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow for future consultation as needs arise.

  15. Matched case-control studies: a review of reported statistical methodology

    Directory of Open Access Journals (Sweden)

    Niven DJ

    2012-04-01

    Daniel J Niven1, Luc R Berthiaume2, Gordon H Fick1, Kevin B Laupland1. 1Department of Critical Care Medicine, Peter Lougheed Centre, Calgary; 2Department of Community Health Sciences, University of Calgary, Calgary, Alberta, Canada. Background: Case-control studies are a common and efficient means of studying rare diseases or illnesses with long latency periods. Matching of cases and controls is frequently employed to control the effects of known potential confounding variables. The analysis of matched data requires specific statistical methods. Methods: The objective of this study was to determine the proportion of published, peer-reviewed matched case-control studies that used statistical methods appropriate for matched data. Using a comprehensive set of search criteria we identified 37 matched case-control studies for detailed analysis. Results: Among these 37 articles, only 16 studies (43%) were analyzed with proper statistical techniques. Studies that were properly analyzed were more likely to have included case patients with cancer and cardiovascular disease compared to those that did not use proper statistics (10/16 or 63%, versus 5/21 or 24%; P = 0.02). They were also more likely to have matched multiple controls for each case (14/16 or 88%, versus 13/21 or 62%; P = 0.08). In addition, studies with properly analyzed data were more likely to have been published in a journal with an impact factor listed in the top 100 according to the Journal Citation Reports index (12/16 or 69%, versus 1/21 or 5%; P ≤ 0.0001). Conclusion: The findings of this study raise concern that the majority of matched case-control studies report results that are derived from improper statistical analyses. This may lead to errors in estimating the relationship between a disease and exposure, as well as the incorrect adaptation of emerging medical literature. Keywords: case-control, matched, dependent data, statistics

  16. Application of Statistical Process Control (SPC) in its Quality Control

    Directory of Open Access Journals (Sweden)

    Carlos Hernández-Pedrera

    2015-12-01

    The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. The specific objectives were to identify the variables to be analysed for statistical process control (SPC), to analyse possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with the SPC study before and after direct monitoring of the production line. Sampling methods and laboratory replacement were used to determine the quality of the finished product; statistical methods were then applied, emphasizing the importance and contribution of their application in monitoring corrective actions and supporting production processes. It was shown that the process is under control, because the results were found within the established control limits. There is, however, a tendency for the distribution to be displaced toward one end of the boundary and to exceed the limits, creating the possibility that under certain conditions the process goes out of control; the results also showed that the process, while within the quality control limits, is operating far from optimal conditions. In none of the study situations were products obtained outside the weight and discoloration limits, but defective products were obtained.

  17. Statistical process control in wine industry using control cards

    OpenAIRE

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

    This paper is based on research into the technological process of automatic filling of wine bottles at a winery in Stip, Republic of Macedonia. Statistical process control using statistical control cards was carried out. The results and recommendations for improving the process are discussed.

  18. Statistical process control charts for monitoring military injuries.

    Science.gov (United States)

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested.
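
    The u-chart construction described above is compact enough to sketch. The example below is illustrative, not the authors' code: it places 3 SD limits around a historical average rate, with limits that widen as each quarter's person-years of exposure shrinks (NumPy assumed).

        import numpy as np

        def u_chart(counts, person_years, baseline):
            """Shewhart u-chart for injury rates with exposure-dependent limits.

            counts, person_years: per-quarter injuries and exposure;
            baseline: index array for the historical period (e.g. 28 quarters).
            """
            counts = np.asarray(counts, float)
            pys = np.asarray(person_years, float)
            u = counts / pys
            u_bar = counts[baseline].sum() / pys[baseline].sum()
            se = np.sqrt(u_bar / pys)                    # varies with exposure
            ucl, lcl = u_bar + 3 * se, np.maximum(0.0, u_bar - 3 * se)
            return u, lcl, ucl, (u > ucl) | (u < lcl)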

  19. Control cards as a statistical quality control resource

    Directory of Open Access Journals (Sweden)

    Aleksandar Živan Drenovac

    2013-02-01

    This paper shows that the application of statistical methods can significantly contribute to increasing the quality of products and services, as well as to raising an institution's rating. The determination of optimal, anticipatory and limit values is based on statistical analysis of samples. Control cards are a very reliable instrument, simple to use and efficient for process control, by which a process is kept within set limits. Thus, control cards can be applied in quality control of weapons and military equipment production processes and maintenance of technical systems, as well as for setting standards and raising the quality level of many other activities.

  20. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  1. A new instrument for statistical process control of thermoset molding

    International Nuclear Information System (INIS)

    Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.

    1991-01-01

    The recent development of a rugged, ceramic, mold-mounted dielectric sensor and high-speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work, special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points as well as temperature and pressure were recorded during normal production and then plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper examples of both RIM and SMC are discussed.

  2. Memory-type control charts in statistical process control

    NARCIS (Netherlands)

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark the bounds of common-cause variation. Once…

  3. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
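
    For time-series intervention data of this kind, a common construction is the individuals (XmR) chart; the sketch below is a generic textbook version, not the article's code, with limits derived from the average moving range of a baseline phase.

        import numpy as np

        def xmr_limits(baseline):
            """Individuals-chart limits: centre +/- 2.66 * average moving range."""
            x = np.asarray(baseline, float)
            mr_bar = np.abs(np.diff(x)).mean()           # average moving range
            centre = x.mean()
            return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

        pre = [12, 14, 11, 13, 12, 15, 13, 12, 14, 13]   # baseline observations
        centre, lcl, ucl = xmr_limits(pre)
        post = [16, 17, 18, 17, 19]                      # post-intervention phase
        special_cause = [y < lcl or y > ucl for y in post]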

  4. Statistical study of chemical additives effects in the waste cementation

    International Nuclear Information System (INIS)

    Tello, Cledola C.O. de; Diniz, Paula S.; Haucz, Maria J.A.

    1997-01-01

    This paper presents the statistical study that was carried out to analyse the effect of chemical additives in the waste cementation process. Three different additives from two manufacturers were tested: set accelerator, set retarder and superplasticizers, in cemented pastes with and without bentonite. The experiments were planned in accordance with a 2³ factorial design, so that the effect of each type of additive, its quantity and its manufacturer on the cemented paste and specimens could be evaluated. The results showed that the use of these additives can improve the cementation process and the product. The admixture quantity and the association with bentonite were the most important factors affecting the process and product characteristics. (author). 4 refs., 9 figs., 4 tabs

  5. Between and beyond additivity and non-additivity : the statistical modelling of genotype by environment interaction in plant breeding

    OpenAIRE

    Eeuwijk, van, F.A.

    1996-01-01

    In plant breeding it is a common observation to see genotypes react differently to environmental changes. This phenomenon is called genotype by environment interaction. Many statistical approaches for analysing genotype by environment interaction rely heavily on the analysis of variance model. Genotype by environment interaction is then taken to be equivalent to non-additivity. This thesis criticizes the analysis of variance approach. Modelling genotype by environment interaction by non-addit...

  6. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  7. Methods and applications of statistics in engineering, quality control, and the physical sciences

    CERN Document Server

    Balakrishnan, N

    2011-01-01

    Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously-published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...

  8. Using Statistical Process Control to Enhance Student Progression

    Science.gov (United States)

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  9. A computerized diagnostic system for nuclear plant control rooms based on statistical quality control

    International Nuclear Information System (INIS)

    Heising, C.D.; Grenzebach, W.S.

    1990-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant located in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, which were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators

  10. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  11. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X̄ (mean), R (range), X (individual observations), MR (moving…

  12. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Science.gov (United States)

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It was therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). Correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to…

  13. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It was therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multi-leaf collimator). Correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to…
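
    The capability study mentioned in both records above compares process spread to the tolerance width; a minimal sketch of the generic Cp/Cpk indices (not the authors' code) follows.

        import statistics

        def capability(samples, lsl, usl):
            """Cp compares spread to tolerance width; Cpk penalizes off-centre processes."""
            mu = statistics.fmean(samples)
            sigma = statistics.stdev(samples)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk                               # > 1.33 is a common target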

  14. Statistical process control for serially correlated data

    NARCIS (Netherlands)

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice

  15. Persuasiveness of Statistics and Patients’ and Mothers’ Narratives in Human Papillomavirus Vaccine Recommendation Messages: A Randomized Controlled Study in Japan

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Okuhara

    2018-04-01

    Background: The human papillomavirus (HPV) vaccination percentage among age-eligible girls in Japan is only in the single digits. This signals the need for effective vaccine communication tactics. This study aimed to examine the influence of statistical data and narrative HPV vaccination recommendation messages on recipients' vaccination intentions. Methods: This randomized controlled study covered 1,432 mothers who had daughters aged 12–16 years. It compared message persuasiveness among four conditions: statistical messages only; narrative messages of a patient who experienced cervical cancer, in addition to statistical messages; narrative messages of a mother whose daughter experienced cervical cancer, in addition to statistical messages; and a control. Intentions to have one's daughter(s) receive the HPV vaccine before and after reading intervention materials were assessed. Statistical analysis was conducted using analysis of variance with Tukey's test or the Games–Howell post hoc test, and analysis of covariance with Bonferroni correction. Results: Vaccination intentions after intervention in the three intervention conditions were higher than in the control condition (p < 0.001). A mother's narrative messages in addition to statistical messages increased HPV vaccination intention the most of all tested intervention conditions. A significant difference in the estimated means of intention, with covariate adjustment for the baseline value (i.e., intention before intervention), was found between a mother's narrative messages in addition to statistical messages and statistical messages only (p = 0.040). Discussion: Mothers' narrative messages may be persuasive when targeting mothers to promote HPV vaccination. This may be because mothers can easily relate to and identify with communications from other mothers. However, for effective HPV vaccine communication, further studies are needed to understand more about persuasive…

  16. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Science.gov (United States)

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  17. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  18. Robust Control Methods for On-Line Statistical Learning

    Directory of Open Access Journals (Sweden)

    Capobianco Enrico

    2001-01-01

    Ensuring that the results of data processing in an experiment are not affected by the presence of outliers is a relevant issue for statistical control and learning studies. Learning schemes should thus be tested for their capacity to handle outliers in the observed training set so as to achieve reliable estimates with respect to the crucial bias and variance aspects. We describe possible ways of endowing neural networks with statistically robust properties by defining feasible error criteria. It is convenient to cast neural nets in state space representations and apply both Kalman filter and stochastic approximation procedures in order to suggest statistically robustified solutions for on-line learning.

  19. Radiographic rejection index using statistical process control

    International Nuclear Information System (INIS)

    Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.

    2015-01-01

    The Repeat Analysis Index (IRR) is one of the items contained in the Quality Control Program dictated by Brazilian radiological protection law and should be determined frequently, at least every six months. In order to extract more and better information from the IRR, this study applies statistical quality control to the reject rate through a control chart for attributes (p chart, GC) and the Pareto chart (GP). Data collection was performed for 9 months, with daily collection during the last four months. The control limits (LC) were established and Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analysed. Relevant information, such as service orders for the X-ray equipment and processors, was cross-referenced to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The GC demonstrated the ability to predict equipment failures, and the GP showed clearly which causes recur in the IRR. (authors) [pt
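
    The attributes chart used above (GC) is the standard p-chart; a sketch of that construction (illustrative, not the authors' Minitab work; NumPy assumed) follows.

        import numpy as np

        def p_chart(rejected, produced):
            """p-chart for daily reject rates; limits widen on low-volume days."""
            rejected = np.asarray(rejected, float)
            produced = np.asarray(produced, float)
            p = rejected / produced
            p_bar = rejected.sum() / produced.sum()      # overall reject rate
            se = np.sqrt(p_bar * (1 - p_bar) / produced)
            ucl, lcl = p_bar + 3 * se, np.maximum(0.0, p_bar - 3 * se)
            return p, lcl, ucl, (p > ucl) | (p < lcl)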

  20. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on…

  1. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  2. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Directory of Open Access Journals (Sweden)

    Stephan Birle

    2016-01-01

    In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes by the use of traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define the control performance as still acceptable may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs: control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that alters the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the…

  3. Use of statistical process control in the production of blood components

    DEFF Research Database (Denmark)

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced when a change was made from a relatively large, inexperienced, occasional component manufacturing staff to an experienced regular manufacturing staff with four technologists. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood components. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional staff…

  4. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.

    1991-01-01

    In scientific research laboratories it is rarely possible to use quality assurance schemes developed for large-scale analysis. Instead, methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control. Analysis of precision serves to detect analytical errors by comparing the a priori precision of the analytical results with the actual variability observed among replicates or duplicates. The method relies on the chi-square distribution to detect excess variability and is quite sensitive even for 5-10 results. Interference control serves to detect analytical bias by comparing results obtained by two different analytical methods, each relying on a different detection principle and therefore exhibiting different influence from matrix elements; only 5-10 sets of results are required to establish whether a regression line passes…

  5. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    Science.gov (United States)

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.

  6. Effective control of complex turbulent dynamical systems through statistical functionals.

    Science.gov (United States)

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.

  7. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  8. Septic tank additive impacts on microbial populations.

    Science.gov (United States)

    Pradhan, S; Hoover, M T; Clark, G H; Gumpertz, M; Wollum, A G; Cobb, C; Strock, J

    2008-01-01

    Environmental health specialists, other onsite wastewater professionals, scientists, and homeowners have questioned the effectiveness of septic tank additives. This paper describes an independent, third-party, field-scale research study of the effects of three liquid bacterial septic tank additives and a control (no additive) on septic tank microbial populations. Microbial populations were measured quarterly in a field study for 12 months in 48 full-size, functioning septic tanks. Bacterial populations in the 48 septic tanks were statistically analyzed with a mixed linear model. Additive effects were assessed for three septic tank maintenance levels (low, intermediate, and high). Dunnett's t-test for tank bacteria (alpha = .05) indicated that none of the treatments were significantly different, overall, from the control at the statistical level tested. In addition, the additives had no significant effects on septic tank bacterial populations at any of the septic tank maintenance levels. Additional controlled, field-based research is warranted, however, to address additional additives and experimental conditions.

  9. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  10. A Statistical Project Control Tool for Engineering Managers

    Science.gov (United States)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is rising; existing methods are limited and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs using the SPCT method, plotting the results of three successful projects and three failed projects, are reviewed, with success and failure defined by the owner.

  11. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  12. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR type fuel plates by applying statistical techniques of process control. The work was divided into four steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and applying statistical tools and standard specifications to perform a comparative study of these processes. (author)

  13. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  14. Net analyte signal based statistical quality control

    NARCIS (Netherlands)

    Skibsted, E.T.S.; Boelens, H.F.M.; Westerhuis, J.A.; Smilde, A.K.; Broad, N.W.; Rees, D.R.; Witte, D.T.

    2005-01-01

    Net analyte signal statistical quality control (NAS-SQC) is a new methodology to perform multivariate product quality monitoring based on the net analyte signal approach. The main advantage of NAS-SQC is that the systematic variation in the product due to the analyte (or property) of interest is

  15. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  16. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  17. Application of statistical process control to qualitative molecular diagnostic assays.

    Directory of Open Access Journals (Sweden)

    Cathal P O'brien

    2014-11-01

    Full Text Available Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to an assay is an obvious disadvantage, this study aimed to solve the problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require larger sample numbers, with a resultant protracted time to detection. Modelled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
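
    The monitoring idea the record describes can be illustrated with a short sketch: estimate the observed mutation frequency, build a confidence interval around it, and flag a deviation when the expected frequency falls outside it. The following is a minimal Python illustration of that idea, not the authors' implementation; the Wilson score interval is one common choice, and the expected frequency and counts are hypothetical.

        import math

        def wilson_interval(k, n, z=1.96):
            """Wilson score 95% CI for a binomial proportion k/n."""
            p_hat = k / n
            denom = 1 + z**2 / n
            centre = (p_hat + z**2 / (2 * n)) / denom
            half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
            return centre - half, centre + half

        # Hypothetical QC check: mutation frequency expected to be 40%
        expected = 0.40
        k, n = 25, 90   # mutations detected in the last 90 assays (hypothetical)
        lo, hi = wilson_interval(k, n)
        in_control = lo <= expected <= hi
        print(f"observed {k/n:.2%}, 95% CI ({lo:.2%}, {hi:.2%}), in control: {in_control}")

    With these hypothetical numbers the interval excludes the expected 40%, so the run would be flagged for review; smaller sample numbers widen the interval and delay detection, which matches the record's observation about time to detection.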

  18. Assessment of the GPC Control Quality Using Non–Gaussian Statistical Measures

    Directory of Open Access Journals (Sweden)

    Domański Paweł D.

    2017-06-01

    Full Text Available This paper presents an alternative approach to the task of control performance assessment. Various statistical measures based on Gaussian and non-Gaussian distribution functions are evaluated. The analysis starts with the review of control error histograms followed by their statistical analysis using probability distribution functions. Simulation results obtained for a control system with the generalized predictive controller algorithm are considered. The proposed approach using Cauchy and Lévy α-stable distributions shows robustness against disturbances and enables effective control loop quality evaluation. Tests of the predictive algorithm prove its ability to detect the impact of the main controller parameters, such as the model gain, the dynamics or the prediction horizon.
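
    The record's point, that Gaussian statistics can misjudge control error in the presence of heavy-tailed disturbances, can be illustrated by fitting both a Gaussian and a Cauchy density to the same error series and comparing log-likelihoods. A small Python sketch with simulated data follows; the full Lévy α-stable fit used in the paper is omitted here as substantially slower, and Cauchy is its α = 1 special case.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Control error with occasional large disturbances (heavy tails)
        err = np.concatenate([rng.normal(0, 1, 900), rng.standard_cauchy(100)])

        for name, dist in [("Gaussian", stats.norm), ("Cauchy", stats.cauchy)]:
            params = dist.fit(err)                      # maximum-likelihood fit
            ll = np.sum(dist.logpdf(err, *params))
            print(f"{name:8s} log-likelihood: {ll:10.1f}")

    On data like this the heavier-tailed fit typically wins by a wide margin, which is the sense in which the non-Gaussian measures are more robust against disturbances.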

  19. Statistical quality control a loss minimization approach

    CERN Document Server

    Trietsch, Dan

    1999-01-01

    While many books on quality espouse the Taguchi loss function, they do not examine its impact on statistical quality control (SQC). But using the Taguchi loss function sheds new light on questions relating to SQC and calls for some changes. This book covers SQC in a way that conforms with the need to minimize loss. Subjects often not covered elsewhere include: (i) measurements, (ii) determining how many points to sample to obtain reliable control charts (for which purpose a new graphic tool, diffidence charts, is introduced), (iii) the connection between process capability and tolerances, (iv)

  20. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  1. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    Science.gov (United States)

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units - a twin-screw high-shear granulator, a fluid bed dryer and a product control unit - were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow, and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T² and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
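
    The core of such an MSPC scheme is a PCA model fitted to normal-operating-condition (NOC) data, with each new observation scored by Hotelling's T² (distance within the model plane) and Q (squared residual off the plane). Below is a minimal Python sketch of that mechanic on simulated stand-in data; real applications would use F- or chi-squared-based control limits rather than the simple empirical percentiles used here, and the number of retained components is an assumption.

        import numpy as np

        rng = np.random.default_rng(0)
        noc = rng.normal(size=(200, 35))          # stand-in for 35 logged variables under NOC

        # PCA model of normal operating conditions (autoscaled)
        mu, sd = noc.mean(axis=0), noc.std(axis=0)
        z = (noc - mu) / sd
        u, s, vt = np.linalg.svd(z, full_matrices=False)
        k = 5                                     # retained components (assumption)
        P = vt[:k].T                              # loadings
        lam = (s[:k] ** 2) / (len(z) - 1)         # component variances

        def t2_q(x):
            """Hotelling's T2 and Q (squared prediction error) for one observation."""
            zx = (x - mu) / sd
            t = zx @ P                            # scores in the model plane
            t2 = np.sum(t**2 / lam)
            q = np.sum((zx - t @ P.T) ** 2)       # residual off the plane
            return t2, q

        # Empirical 99% control limits taken from the NOC data itself
        nominal = np.array([t2_q(x) for x in noc])
        t2_lim, q_lim = np.percentile(nominal, 99, axis=0)
        t2, q = t2_q(rng.normal(size=35) + 1.5)   # a disturbed sample
        print(t2 > t2_lim, q > q_lim)

    A disturbance that breaks the correlation structure of the NOC data tends to inflate Q, while one that moves along it inflates T²; contribution plots then decompose whichever statistic alarmed into per-variable terms, as the record describes.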

  2. An introduction to statistical process control in research proteomics.

    Science.gov (United States)

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. This work introduces statistical process control as an objective strategy for quality control and shows how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research workflows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis workflows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier

  3. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  4. Paper Quality Control Using Statistical Process Control in Paper Machine 3 [Pengendalian Kualitas Kertas Dengan Menggunakan Statistical Process Control di Paper Machine 3]

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2017-01-01

    Full Text Available The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method. Statistical process control (SPC) is a technique for solving problems and is used to monitor, control, analyze, manage and improve products and processes using statistical methods. Based on Pareto diagrams, the wavy defect is found to be the most frequent defect, accounting for 81.7%. The human factor, meanwhile, is found to be the main cause of defects, primarily due to lack of understanding of the machinery and lack of training, both leading to errors in data input.

  5. Pitch Motion Stabilization by Propeller Speed Control Using Statistical Controller Design

    DEFF Research Database (Denmark)

    Nakatani, Toshihiko; Blanke, Mogens; Galeazzi, Roberto

    2006-01-01

    This paper describes dynamics analysis of a small training boat and a possibility of ship pitch stabilization by control of propeller speed. After upgrading the navigational system of an actual small training boat, in order to identify the model of the ship, the real data collected by sea trials ... were used for statistical analysis and system identification. This analysis shows that the pitching motion is indeed influenced by engine speed and it is suggested that there exists a possibility of reducing the pitching motion by properly controlling the engine throttle. Based on this observation...

  6. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    An, Sang Ha

    2006-02-01

    Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) and the fouling resistance of a heat exchanger. This research uses Shewhart X-bar and R charts, cumulative sum (CUSUM) charts, and the sequential probability ratio test (SPRT) to analyze the process for the state of statistical control, and a Control Chart Analyzer (CCA) was built to support these analyses and to decide when the process is in error. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability
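
    Of the charts named in this record, the tabular CUSUM is the one that accumulates small, persistent shifts a Shewhart chart would miss. The following is a minimal Python sketch of a two-sided tabular CUSUM on a simulated, slowly drifting process variable; the target, standard deviation, and the usual design constants k = 0.5 and h = 5 are illustrative assumptions, not values from the thesis.

        import numpy as np

        def tabular_cusum(x, target, sigma, k=0.5, h=5.0):
            """Two-sided tabular CUSUM; returns indices where either sum exceeds h*sigma."""
            c_plus = c_minus = 0.0
            alarms = []
            for i, xi in enumerate(x):
                c_plus = max(0.0, c_plus + (xi - target) - k * sigma)
                c_minus = max(0.0, c_minus + (target - xi) - k * sigma)
                if c_plus > h * sigma or c_minus > h * sigma:
                    alarms.append(i)
            return alarms

        rng = np.random.default_rng(1)
        data = np.concatenate([rng.normal(70.0, 0.5, 100),      # in-control readings
                               rng.normal(70.6, 0.5, 50)])      # slow fouling-induced shift
        print(tabular_cusum(data, target=70.0, sigma=0.5)[:1])  # first alarm index

    The reference value k·σ absorbs in-control noise, so the sums stay near zero until a sustained shift appears; the printed index marks how quickly the simulated drift is caught after sample 100.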

  7. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Science.gov (United States)

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three degrees-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively
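
    The tracking metric itself is simple to compute: normalize both image patches and average their product. The sketch below illustrates, on synthetic volumes, how a patient-specific lower limit derived from the first five fractions could flag a poor registration; the 3-sigma rule and all array contents are hypothetical stand-ins for the paper's CBCT data.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation of two equally shaped image volumes."""
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(np.mean(a * b))

        rng = np.random.default_rng(2)
        ref = rng.normal(size=(32, 32, 32))                  # reference patch inside the RTV
        daily = [ref + rng.normal(scale=s, size=ref.shape)   # daily images of varying quality
                 for s in (0.10, 0.10, 0.12, 0.10, 0.11, 0.80)]
        scores = [ncc(ref, d) for d in daily]

        baseline = scores[:5]                                # first five fractions
        limit = np.mean(baseline) - 3 * np.std(baseline, ddof=1)  # 3-sigma lower limit (assumption)
        flagged = [i for i, s in enumerate(scores) if s < limit]
        print(flagged)   # the degraded sixth registration should appear here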

  8. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    Science.gov (United States)

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  9. [Criticism of the additive model of the randomized controlled trial].

    Science.gov (United States)

    Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine

    2008-01-01

    Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Its methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active principle. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the necessity to take these factors into account for given symptoms or pathologies, as well as the problem of the "specific" effect.

  10. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    Science.gov (United States)

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January, 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
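
    The "right-sizing" in Westgard Sigma Rules starts from the sigma metric of the assay, sigma = (TEa − |bias|) / CV. The Python sketch below computes that metric and picks a QC plan from commonly cited sigma bands; the numbers and the simplified band-to-rule mapping are illustrative assumptions, not a restatement of the article.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Sigma metric of a lab test: (allowable total error - |bias|) / imprecision."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Hypothetical assay: TEa 10%, bias 1.5%, CV 1.7%
        sigma = sigma_metric(10.0, 1.5, 1.7)

        # Simplified rule selection in the spirit of Westgard Sigma Rules (assumption)
        if sigma >= 6:
            plan = "single wide-limit rule, N=2 control measurements"
        elif sigma >= 5:
            plan = "small multirule set, N=2"
        elif sigma >= 4:
            plan = "larger multirule set, N=4"
        else:
            plan = "full multirule with more controls and runs"
        print(f"sigma = {sigma:.1f}: {plan}")

    Higher-sigma assays tolerate a single wide-limit rule with few control measurements, which is exactly the economy the Total QC Plan argument is making.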

  11. COMPARISON OF STATISTICALLY CONTROLLED MACHINING SOLUTIONS OF TITANIUM ALLOYS USING USM

    Directory of Open Access Journals (Sweden)

    R. Singh

    2010-06-01

    Full Text Available The purpose of the present investigation is to compare statistically controlled machining solutions for titanium alloys using ultrasonic machining (USM). In this study, the previously developed Taguchi model for USM of titanium and its alloys has been investigated and compared. Relationships between the material removal rate, tool wear rate, surface roughness and other controllable machining parameters (power rating, tool type, slurry concentration, slurry type, slurry temperature and slurry size) have been deduced. The results of this study suggest that at the best settings of the controllable machining parameters for titanium alloys (based upon the Taguchi design), the machining solution with USM is statistically controlled, which is not observed for other settings of the input parameters on USM.

  12. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
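
    The interrupted time series side of that comparison is usually operationalized as segmented regression: fit a baseline level and trend, plus a level change and slope change at the intervention point. A minimal Python sketch on simulated monthly data; the intervention month and effect sizes are hypothetical.

        import numpy as np

        rng = np.random.default_rng(8)
        t = np.arange(48)                        # months of data
        after = (t >= 24).astype(float)          # intervention at month 24
        y = 50 - 0.1 * t - 4 * after - 0.3 * after * (t - 24) + rng.normal(0, 1.5, t.size)

        # Segmented regression: intercept, baseline trend, level change, slope change
        X = np.column_stack([np.ones_like(t), t, after, after * (t - 24)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(f"level change: {beta[2]:.2f}, slope change: {beta[3]:.2f}")

    A control chart on the same series would instead ask whether post-intervention points break the pre-intervention limits; the two approaches answer subtly different questions, which is the article's point.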

  13. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
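
    A p-chart like the one the authors describe sets binomial 3-sigma limits around the overall event rate, with limits that vary with each subgroup's caseload. A small Python sketch with hypothetical monthly adverse-event counts:

        import numpy as np

        events = np.array([19, 22, 18, 25, 21, 38, 20])    # adverse events per month (hypothetical)
        n = np.array([110, 120, 105, 130, 115, 118, 112])  # anesthetics per month
        p = events / n

        p_bar = events.sum() / n.sum()                     # overall event rate
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)           # per-subgroup standard error
        ucl = p_bar + 3 * sigma
        lcl = np.clip(p_bar - 3 * sigma, 0, None)

        for i, (pi, u, l) in enumerate(zip(p, ucl, lcl)):
            if not l <= pi <= u:
                print(f"month {i}: rate {pi:.2%} outside ({l:.2%}, {u:.2%}) - special cause?")

    Points inside the limits reflect predictable common cause variation; the simulated spike in month 5 lands above its upper limit and would prompt a search for a special cause, mirroring the paper's stable-versus-unstable process distinctions.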

  15. Sampling methods to the statistical control of the production of blood components.

    Science.gov (United States)

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

    The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied: commonly, the sampling is intended to comply solely with the 1% specification for the produced blood components. From a purely statistical viewpoint, however, this model is arguably not related to a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and related statistical process control decisions for the purpose they are suggested for. Copyright © 2017 Elsevier Ltd. All rights reserved.
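
    As a concrete illustration of the second proposal (sampling based on the proportion of a finite population), the classical Cochran formula with a finite population correction gives the number of units to inspect. This Python sketch is a generic textbook calculation, not the article's exact procedure; the lot size, expected nonconformity rate, and margin of error are hypothetical.

        import math

        def sample_size_proportion(N, p=0.5, e=0.05, z=1.96):
            """Sample size to estimate a proportion in a finite lot of N units."""
            n0 = z**2 * p * (1 - p) / e**2             # infinite-population size (Cochran)
            return math.ceil(n0 / (1 + (n0 - 1) / N))  # finite population correction

        # Hypothetical monthly lot of 2000 red cell concentrates,
        # expected nonconformity rate 1%, margin of error 1%
        print(sample_size_proportion(N=2000, p=0.01, e=0.01))

    Sampling a fixed 1% of production, by contrast, ignores how the required n actually scales with the lot size and the acceptable error, which is the inconsistency the article argues against.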

  16. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
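
    A stripped-down version of such a CCP recognizer can be put together in a few lines: simulate labelled chart windows, extract a handful of shape and statistical features, and train a small neural network. The sketch below uses scikit-learn's MLPClassifier on synthetic normal/trend/shift patterns purely for illustration; the paper's actual system uses richer features and a cuckoo-optimized network.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)

        def make_window(label, n=32):
            x = rng.normal(size=n)                   # in-control noise
            if label == 1:                           # upward trend pattern
                x += np.linspace(0, 3, n)
            elif label == 2:                         # sudden mean shift pattern
                x[n // 2:] += 2.5
            return x

        def features(x):
            t = np.arange(len(x))
            slope = np.polyfit(t, x, 1)[0]           # least-squares slope (shape feature)
            half_diff = x[len(x) // 2:].mean() - x[:len(x) // 2].mean()
            return [x.mean(), x.std(), slope, half_diff]

        labels = rng.integers(0, 3, 600)
        X = np.array([features(make_window(c)) for c in labels])
        Xtr, Xte, ytr, yte = train_test_split(X, labels, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                            random_state=0).fit(Xtr, ytr)
        print(f"holdout accuracy: {clf.score(Xte, yte):.2f}")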

  17. Microcontroller based automatic liquid poison addition control system

    International Nuclear Information System (INIS)

    Kapatral, R.S.; Ananthakrishnan, T.S.; Pansare, M.G.

    1989-01-01

    Microcontrollers are finding increasing applications in instrumentation, where complex digital circuits can be replaced by a compact and simple circuit, thus enhancing reliability; in addition, intelligence and flexibility can be incorporated. For applications not requiring a large amount of read/write memory (RAM), microcontrollers are ideally suited since they contain programmable memory (EPROM), parallel input/output lines, data memory, programmable timers and serial interface ports on one chip. This paper describes the design of an automatic liquid poison addition control system (ALPAS) using Intel's 8-bit microcontroller 8751, which is used to generate complex timing control sequence signals for liquid poison addition to the moderator in a nuclear reactor. ALPAS monitors digital inputs coming from the protection system and regulating system of a nuclear reactor and provides control signals for liquid poison addition for long-term safe shutdown of the reactor after a reactor trip, and helps the regulating system to reduce the power of the reactor during operation. Special hardware and software features have been incorporated to improve performance and fault detection. (author)

  18. Statistical Process Control. Impact and Opportunities for Ohio.

    Science.gov (United States)

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  19. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    ... a large number of nonconforming products are manufactured. ... sample size, n, sampling interval, h, and control limit parameter, k, that minimize the ...

  20. Optimization of Biodiesel-Diesel Blended Fuel Properties and Engine Performance with Ether Additive Using Statistical Analysis and Response Surface Methods

    Directory of Open Access Journals (Sweden)

    Obed M. Ali

    2015-12-01

    Full Text Available In this study, the fuel properties and engine performance of blended palm biodiesel-diesel fuel using diethyl ether as an additive have been investigated. The properties of B30 blended palm biodiesel-diesel fuel were measured and analyzed statistically with the addition of 2%, 4%, 6% and 8% (by volume) diethyl ether additive. The engine tests were conducted at increasing engine speeds from 1500 rpm to 3500 rpm under constant load. Optimization of the independent variables was performed using the desirability approach of response surface methodology (RSM) with the goal of minimizing emissions and maximizing performance parameters. The experiments were designed using a statistical tool known as design of experiments (DoE) based on RSM.

  1. Additive Feed Forward Control with Neural Networks

    DEFF Research Database (Denmark)

    Sørensen, O.

    1999-01-01

    This paper demonstrates a method to control a non-linear, multivariable, noisy process using trained neural networks. The basis for the method is a trained neural network controller acting as the inverse process model. A training method for obtaining such an inverse process model is applied. ... A suitable 'shaped' (low-pass filtered) reference is used to overcome problems with excessive control action when using a controller acting as the inverse process model. The control concept is Additive Feed Forward Control, where the trained neural network controller, acting as the inverse process model ..., is placed in a supplementary pure feed-forward path to an existing feedback controller. This concept benefits from the fact that an existing, traditionally designed feedback controller can be retained without any modifications, and after training the connection of the neural network feed-forward controller...

  2. Reducing lumber thickness variation using real-time statistical process control

    Science.gov (United States)

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  3. Grain Structure Control of Additively Manufactured Metallic Materials

    Directory of Open Access Journals (Sweden)

    Fuyao Yan

    2017-11-01

    Full Text Available Grain structure control is challenging for metal additive manufacturing (AM. Grain structure optimization requires the control of grain morphology with grain size refinement, which can improve the mechanical properties of additive manufactured components. This work summarizes methods to promote fine equiaxed grains in both the additive manufacturing process and subsequent heat treatment. Influences of temperature gradient, solidification velocity and alloy composition on grain morphology are discussed. Equiaxed solidification is greatly promoted by introducing a high density of heterogeneous nucleation sites via powder rate control in the direct energy deposition (DED technique or powder surface treatment for powder-bed techniques. Grain growth/coarsening during post-processing heat treatment can be restricted by presence of nano-scale oxide particles formed in-situ during AM. Grain refinement of martensitic steels can also be achieved by cyclic austenitizing in post-processing heat treatment. Evidently, new alloy powder design is another sustainable method enhancing the capability of AM for high-performance components with desirable microstructures.

  4. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality - SPC (item 20 of the ISO 9000 series documentation system for quality control) - for various processes, products and services belong among the basic quantitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the following principles are presented in the contribution: diagnostics of cause and effect; Pareto analysis and the Lorenz curve; number distributions and frequency curves of random variable distributions; and Shewhart control charts.

  5. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Science.gov (United States)

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
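
    Pareto analysis, as used in the record above, is just a ranked cumulative breakdown that points to the few metrics responsible for most of the flagged variation. A minimal Python sketch with hypothetical out-of-limit counts per SProCoP-style metric:

        # Hypothetical counts of out-of-control observations per QC metric
        counts = {"retention time": 42, "peak asymmetry": 11, "resolution": 7,
                  "ion intensity": 25, "mass accuracy": 3}

        total = sum(counts.values())
        cum = 0.0
        for metric, c in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
            cum += 100.0 * c / total
            print(f"{metric:15s} {c:3d}  cumulative {cum:5.1f}%")

    Reading the cumulative column top-down shows where effort pays off first; in this made-up example the top two metrics account for roughly three quarters of the flagged observations.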

  6. Statistical transformation and the interpretation of inpatient glucose control data.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
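
    The transformation step the study describes is easy to reproduce in outline: Box-Cox-transform the skewed glucose values, then run an EWMA chart on the transformed series. The sketch below uses scipy's boxcox on simulated lognormal data; the baseline window, weight of 0.2, and 3-sigma width are illustrative assumptions rather than the study's settings.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        glucose = rng.lognormal(mean=np.log(140), sigma=0.25, size=300)  # skewed mg/dL values

        y, lam = stats.boxcox(glucose)                   # transform toward normality
        print(f"Box-Cox lambda = {lam:.2f}")
        mu, sigma = y[:100].mean(), y[:100].std(ddof=1)  # baseline period (assumption)

        L, w = 3.0, 0.2                                  # chart width and EWMA weight
        z = mu
        for t, yt in enumerate(y[100:], start=1):
            z = w * yt + (1 - w) * z                     # exponentially weighted moving average
            se = sigma * np.sqrt(w / (2 - w) * (1 - (1 - w) ** (2 * t)))
            if abs(z - mu) > L * se:
                print(f"sample {t}: EWMA out of control")
                break

    Running the same chart on the untransformed, skewed values can change when out-of-control events appear, which is the interpretive difference the study reports.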

  7. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of automatic processors and promote QC, based on analyzing the QC management chart for an automatic processor by statistical methods, and on evaluating and interpreting the data and trends of the chart. Method: The speed, contrast and minimum density of the step wedge of a film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated. The data and the working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, the authors can judge whether it is a symmetric bell-shaped curve or not. If not, it indicates that a few extremes overstepping the control limits are possibly pulling the curve to the left or right. If it is a normal distribution, the standard deviation (s) is observed. When x-bar ± 2s lies within the upper and lower control limits of the relative performance indexes, it indicates that the processor worked in a stable status in this period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. The authors can deepen the understanding and application of the trend chart, and improve quality management to a new level

  8. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  9. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  10. Exploring the use of statistical process control methods to assess course changes

    Science.gov (United States)

    Vollstedt, Ann-Marie

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g., inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, in this study quality was defined as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshman classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, it was felt that individual course grades needed adjustment to accommodate the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control

  11. Application of classical versus bayesian statistical control charts to on-line radiological monitoring

    International Nuclear Information System (INIS)

    DeVol, T.A.; Gohres, A.A.; Williams, C.L.

    2009-01-01

    False positive and false negative incidence rates of radiological monitoring data from classical and Bayesian statistical process control chart techniques are compared. On-line monitoring for illicit radioactive material with no false positives or false negatives is the goal of homeland security monitoring, but is unrealistic. Statistical fluctuations in the detector signal, short detection times, large source-to-detector distances, and shielding effects make distinguishing between a radiation source and natural background particularly difficult. Experimental time series data were collected using a 1″ × 1″ LaCl3(Ce)-based scintillation detector (Scionix, Orlando, FL) under various simulated conditions. Experimental parameters included radionuclide (gamma-ray) energy, activity, density thickness (source-to-detector distance and shielding), time, and temperature. All statistical algorithms were developed using MATLAB™. The Shewhart (3-σ) control chart and the cumulative sum (CUSUM) control chart are the classical procedures adopted, while the Bayesian technique is the Shiryayev-Roberts (S-R) control chart. The Shiryayev-Roberts method was the best method for controlling the number of false positive detects, followed by the CUSUM method. However, the Shiryayev-Roberts method, used without modification, resulted in one of the highest false negative incidence rates independent of the signal strength. Modification of the Shiryayev-Roberts statistical analysis method reduced the number of false negatives, but resulted in an increase in the false positive incidence rate. (author)
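
    For a known mean shift in Gaussian count data, the Shiryayev-Roberts statistic has a one-line recursion: R_n = (1 + R_{n-1}) · Λ_n, with Λ_n the likelihood ratio of the post-change to pre-change densities, and an alarm when R_n crosses a threshold A. A minimal Python sketch follows; the background rate, shift size, and threshold are hypothetical, and this is the generic textbook form rather than the paper's modified variant.

        import numpy as np

        def shiryaev_roberts(x, mu0, sigma, delta, A=1000.0):
            """S-R statistic R_n = (1 + R_{n-1}) * LR_n for a shift from mu0 to mu0+delta."""
            R = 0.0
            for i, xi in enumerate(x):
                lr = np.exp(delta * (xi - mu0 - delta / 2.0) / sigma**2)  # Gaussian LR
                R = (1.0 + R) * lr
                if R > A:
                    return i          # index of the first alarm
            return None

        rng = np.random.default_rng(5)
        bkg = rng.normal(100.0, 10.0, 200)                        # background count rate
        src = np.concatenate([bkg, rng.normal(115.0, 10.0, 50)])  # weak source appears
        print(shiryaev_roberts(src, mu0=100.0, sigma=10.0, delta=15.0))

    The recursion never resets to zero after quiet periods, which is one intuition for why the unmodified S-R chart controls false positives well yet can be slow to clear, echoing the trade-off the study reports.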

  12. Effect of food additives on hyperphosphatemia among patients with end-stage renal disease: a randomized controlled trial.

    Science.gov (United States)

    Sullivan, Catherine; Sayre, Srilekha S; Leon, Janeen B; Machekano, Rhoderick; Love, Thomas E; Porter, David; Marbury, Marquisha; Sehgal, Ashwini R

    2009-02-11

    High dietary phosphorus intake has deleterious consequences for renal patients and is possibly harmful for the general public as well. To prevent hyperphosphatemia, patients with end-stage renal disease limit their intake of foods that are naturally high in phosphorus. However, phosphorus-containing additives are increasingly being added to processed and fast foods. The effect of such additives on serum phosphorus levels is unclear. To determine the effect of limiting the intake of phosphorus-containing food additives on serum phosphorus levels among patients with end-stage renal disease, a cluster randomized controlled trial was conducted at 14 long-term hemodialysis facilities in northeast Ohio. Two hundred seventy-nine patients with elevated baseline serum phosphorus levels (>5.5 mg/dL) were recruited between May and October 2007. Two shifts at each of 12 large facilities and 1 shift at each of 2 small facilities were randomly assigned to an intervention or control group. Intervention participants (n = 145) received education on avoiding foods with phosphorus additives when purchasing groceries or visiting fast food restaurants. Control participants (n = 134) continued to receive usual care. The main outcome was the change in serum phosphorus level after 3 months. At baseline, there was no significant difference in serum phosphorus levels between the 2 groups. After 3 months, the decline in serum phosphorus levels was 0.6 mg/dL larger among intervention vs control participants (95% confidence interval, -1.0 to -0.1 mg/dL). Intervention participants also had statistically significant increases in reading ingredient lists (P < .001), but not in food knowledge scores (P = .13). Educating end-stage renal disease patients to avoid phosphorus-containing food additives resulted in modest improvements in hyperphosphatemia. clinicaltrials.gov Identifier: NCT00583570.

  13. Is there still a role for additional linear ablation in addition to pulmonary vein isolation in patients with paroxysmal atrial fibrillation? An Updated Meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Hu, Xiaoliang; Jiang, Jingzhou; Ma, Yuedong; Tang, Anli

    2016-04-15

    The benefits and risks of additional left atrium (LA) linear ablation in patients with paroxysmal atrial fibrillation (AF) remain unclear. Randomized controlled trials were identified in the PubMed, Web of Science, Embase and Cochrane databases, and the relevant papers were examined. Pooled relative risks (RR) and 95% confidence intervals (95% CI) were estimated using random effects models. The primary endpoint was the maintenance of sinus rhythm after a single ablation. Nine randomized controlled trials involving 1138 patients were included in this analysis. Additional LA linear ablation did not improve the maintenance of sinus rhythm following a single procedure (RR, 1.03; 95% CI, 0.93-1.13; P = 0.60). A subgroup analysis demonstrated that all methods of additional linear ablation failed to improve the outcome. Additional linear ablation significantly increased the mean procedural time (166.53 ± 67.7 vs. 139.57 ± 62.44 min, P < 0.05). In summary, additional linear ablation did not exhibit any benefits in terms of sinus rhythm maintenance for paroxysmal AF patients following a single procedure, while it significantly increased the mean procedural, fluoroscopy and RF application times. This additional ablation was not associated with a statistically significant increase in complication rates. This finding must be confirmed by further large, high-quality clinical trials. Copyright © 2016. Published by Elsevier Ireland Ltd.

  14. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed - an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered there and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows diagnostic plots to be built from the parameter estimates that can provide valuable insight...

  15. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
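
    The moving range chart the authors recommend is straightforward to construct: plot individual values against limits of x̄ ± 2.66·MR̄, and the moving ranges against an upper limit of 3.267·MR̄ (the constants derive from d2 = 1.128 for subgroups of two). A minimal Python sketch on hypothetical daily door-to-doctor times:

        import numpy as np

        x = np.array([42, 51, 38, 47, 55, 44, 49, 61, 46, 50,
                      48, 43, 57, 45, 52, 88, 47, 49, 44, 53])  # daily median minutes (hypothetical)

        mr = np.abs(np.diff(x))                 # moving ranges of consecutive days
        x_bar, mr_bar = x.mean(), mr.mean()

        ucl_x = x_bar + 2.66 * mr_bar           # individuals chart limits
        lcl_x = x_bar - 2.66 * mr_bar
        ucl_mr = 3.267 * mr_bar                 # moving range chart upper limit

        signals = np.where((x > ucl_x) | (x < lcl_x))[0]
        print(f"mean {x_bar:.1f}, limits ({lcl_x:.1f}, {ucl_x:.1f}), special-cause days: {signals}")

    The day with the 88-minute value falls above the upper limit and would be investigated as special cause variation (signal); the remaining points are treated as common cause noise rather than triggers for action, which is exactly the distinction the article draws.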

  16. Use of statistical process control as part of a quality assurance plan

    International Nuclear Information System (INIS)

    Acosta, S.; Lewis, C.

    2013-01-01

    One of the technical requirements of the IRAM ISO 17025 standard for the accreditation of testing laboratories is assurance of the quality of results through the control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory, in the process of accreditation for environmental measurements of strontium-90, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work is the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to consider significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of statistical process control. The data obtained yielded lower and upper limits for the preparation of the blank control chart. In this way the process of characterization of the blank was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan

  17. Investigation of variation of additional enthalpy of proteins with respect to pH by statistical mechanical methods

    International Nuclear Information System (INIS)

    Oylumoglu, G.

    2005-01-01

    In this study the variation of additional enthalpy with respect to pH has been investigated by statistical mechanical methods. To bring out the additional effect, the partition functions of the proteins are calculated in the single-protein-molecule approximation. From the partition function, the free energies of the proteins are obtained, and in this way the additional free energy has been used in the calculation of the thermodynamic quantities. The additional enthalpy H_D has been obtained by taking the effective electric field E and the constant dipole moment M as thermodynamic variables and using the Maxwell relations. In the presented semi-phenomenological theory, the necessary data are taken from the experimental study of P.L. Privalov. The variation in the additional enthalpy H_D has been investigated in the pH interval 1-5 and the results of the calculations are discussed for lysozyme

  18. Guideline implementation in clinical practice: Use of statistical process control charts as visual feedback devices

    Directory of Open Access Journals (Sweden)

    Fahad A Al-Hussein

    2009-01-01

    Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.

  19. Statistical applications for chemistry, manufacturing and controls (CMC) in the pharmaceutical industry

    CERN Document Server

    Burdick, Richard K; Pfahler, Lori B; Quiroz, Jorge; Sidor, Leslie; Vukovinsky, Kimberly; Zhang, Lanju

    2017-01-01

    This book examines statistical techniques that are critically important to Chemistry, Manufacturing, and Control (CMC) activities. Statistical methods are presented with a focus on applications unique to the CMC in the pharmaceutical industry. The target audience consists of statisticians and other scientists who are responsible for performing statistical analyses within a CMC environment. Basic statistical concepts are addressed in Chapter 2 followed by applications to specific topics related to development and manufacturing. The mathematical level assumes an elementary understanding of statistical methods. The ability to use Excel or statistical packages such as Minitab, JMP, SAS, or R will provide more value to the reader. The motivation for this book came from an American Association of Pharmaceutical Scientists (AAPS) short course on statistical methods applied to CMC applications presented by four of the authors. One of the course participants asked us for a good reference book, and the only book recomm...

  20. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Science.gov (United States)

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  1. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess the electron beam monitoring statistical process control (SPC) in linear accelerator (linac) daily quality control, we present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  2. Statistical disclosure control for microdata methods and applications in R

    CERN Document Server

    Templ, Matthias

    2017-01-01

    This book on statistical disclosure control presents the theory, applications and software implementation of the traditional approach to (micro)data anonymization, including data perturbation methods, disclosure risk, data utility, information loss and methods for simulating synthetic data. Introducing readers to the R packages sdcMicro and simPop, the book also features numerous examples and exercises with solutions, as well as case studies with real-world data, accompanied by the underlying R code to allow readers to reproduce all results. The demand for and volume of data from surveys, registers or other sources containing sensitive information on persons or enterprises have increased significantly over the last several years. At the same time, privacy protection principles and regulations have imposed restrictions on the access and use of individual data. Proper and secure microdata dissemination calls for the application of statistical disclosure control methods to the data before release. This book is in...

  3. Moment based model predictive control for systems with additive uncertainty

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Weiland, S.; Ludlage, J.H.A.

    2017-01-01

    In this paper, we present a model predictive control (MPC) strategy based on the moments of the state variables and the cost functional. The statistical properties of the state predictions are calculated through the open loop iteration of the dynamics and used in the formulation of the MPC cost function. We…

  4. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical…

  5. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation…

  6. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
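
    A minimal sketch of the subgroup-based Shewhart charting such a variability-reduction exercise typically rests on appears below; the soap-weight data and subgroup size are hypothetical, while A2, D3 and D4 are the standard tabulated constants for subgroups of five.

        import numpy as np

        rng = np.random.default_rng(7)
        # Hypothetical soap-weight data: 25 subgroups of n = 5 consecutive units.
        data = rng.normal(100.0, 1.2, size=(25, 5))

        xbar = data.mean(axis=1)                 # subgroup means
        r = data.max(axis=1) - data.min(axis=1)  # subgroup ranges
        xbarbar, rbar = xbar.mean(), r.mean()

        # Standard Shewhart constants for subgroup size n = 5.
        A2, D3, D4 = 0.577, 0.0, 2.114

        x_ucl, x_lcl = xbarbar + A2 * rbar, xbarbar - A2 * rbar
        r_ucl, r_lcl = D4 * rbar, D3 * rbar

        print(f"Xbar chart: CL={xbarbar:.2f}, UCL={x_ucl:.2f}, LCL={x_lcl:.2f}")
        print(f"R chart:    CL={rbar:.2f}, UCL={r_ucl:.2f}, LCL={r_lcl:.2f}")
        print("subgroups out of control:",
              np.where((xbar > x_ucl) | (xbar < x_lcl))[0])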

  7. Evaluation of statistical control charts for on-line radiation monitoring

    International Nuclear Information System (INIS)

    Hughes, L.D.; DeVol, T.A.

    2008-01-01

    Statistical control charts are presented for the evaluation of time series radiation counter data from flow cells used for monitoring of low levels of ⁹⁹TcO₄⁻ in environmental solutions. Control chart methods consisted of the 3-sigma (3σ) chart, the cumulative sum (CUSUM) chart, and the exponentially weighted moving average (EWMA) chart. Each method involves a control limit based on the detector background which constitutes the detection limit. Both the CUSUM and EWMA charts are suitable to detect and estimate sample concentration requiring less solution volume than when using a 3σ control chart. Data presented here indicate that the overall accuracy and precision of the CUSUM method is the best. (author)
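
    To make the comparison concrete, here is a sketch of one-sided CUSUM and EWMA monitoring on simulated Poisson count data with a small step change. The background rate, reference value k, decision interval h, and EWMA weight are illustrative choices, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(3)
        # Hypothetical counting data: background of 50 cpm with a small activity
        # step (55 cpm) entering at sample 120; a stand-in for flow-cell records.
        counts = np.concatenate([rng.poisson(50, 120),
                                 rng.poisson(55, 80)]).astype(float)

        mu0 = 50.0
        sigma0 = np.sqrt(mu0)              # Poisson background: variance = mean

        # One-sided upper CUSUM with reference value k and decision interval h.
        k, h = 0.5 * sigma0, 5.0 * sigma0
        s, cusum_alarm = 0.0, None
        for i, c in enumerate(counts):
            s = max(0.0, s + (c - mu0) - k)
            if s > h and cusum_alarm is None:
                cusum_alarm = i

        # EWMA with weight lam and 3-sigma asymptotic limits.
        lam = 0.2
        limit = 3 * sigma0 * np.sqrt(lam / (2 - lam))
        z, ewma_alarm = mu0, None
        for i, c in enumerate(counts):
            z = lam * c + (1 - lam) * z
            if z > mu0 + limit and ewma_alarm is None:
                ewma_alarm = i

        print("first CUSUM alarm at sample:", cusum_alarm)
        print("first EWMA alarm at sample:", ewma_alarm)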

  8. Active-Reactive Additional Damping Control of a Doubly-Fed Induction Generator Based on Active Disturbance Rejection Control

    Directory of Open Access Journals (Sweden)

    Yanfeng Ma

    2018-05-01

    Full Text Available Large-scale wind power interfacing to the power grid has an impact on the stability of the power system. However, with an additional damping controller on the wind generator, new ways of improving system damping and suppressing the low frequency oscillation (LFO) of power systems can be put forward. In this paper, an active-reactive power additional damping controller based on active disturbance rejection control (ADRC) is proposed. In order to improve the precision of the controller, the theory of data-driven control is adopted, using the numerical algorithms for subspace state space system identification (N4SID) to obtain a second-order model of the ADRC-controlled object. Based on the identification model, the ADRC additional damping controller is designed. Taking a 2-area 4-machine system containing a doubly fed induction generator (DFIG) wind farm as an example, it is verified that the active-reactive additional damping controller designed in this paper performs well in suppressing negative-damping LFO and forced power oscillation. When the operation state of the power system changes, it can still restrain the LFO effectively, showing stronger robustness and better effectiveness compared to the traditional proportional-integral-derivative (PID) additional damping controller.

  9. Experience in statistical quality control for road construction in South Africa

    CSIR Research Space (South Africa)

    Mitchell, MF

    1977-06-01

    Full Text Available The application of statistically oriented acceptance control procedures to a major road construction project is examined, and it is concluded that such procedures promise to be of benefit to both the client and the contractor....

  10. Quality control statistic for laboratory analysis and assays in Departamento de Tecnologia de Combustiveis - IPEN-BR

    International Nuclear Information System (INIS)

    Lima, Waldir C. de; Lainetti, Paulo E.O.; Lima, Roberto M. de; Peres, Henrique G.

    1996-01-01

    The purpose of this work is to study the introduction of statistical control for the tests and analyses performed in the Departamento de Tecnologia de Combustiveis. The following are succinctly introduced: the theory of statistical process control, the elaboration of control charts, the definition of standard tests (or analyses), and how the standards are employed to determine the control limits in the charts. The most significant result is the form applied for practical quality control; moreover, the use of one verification and analysis standard in the control laboratory is also exemplified. (author)

  11. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    Science.gov (United States)

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  12. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate them into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product.

  13. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate them into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  14. Monitoring a PVC batch process with multivariate statistical process control charts

    NARCIS (Netherlands)

    Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.

    1999-01-01

    Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these…

  15. The application of statistical process control in linac quality assurance

    International Nuclear Information System (INIS)

    Li Dingyu; Dai Jianrong

    2009-01-01

    Objective: To improve the linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC is applied to set the control limits of QA data, draw charts and differentiate between random and systematic errors. An SPC quality assurance software package named QA MANAGER has been developed in VB programming for clinical use. Two clinical cases are analyzed with SPC to study the daily output QA of a 6MV photon beam. Results: In the clinical cases, SPC is able to identify the systematic errors. Conclusion: The SPC application may assist in detecting systematic errors in linac quality assurance; it alarms on abnormal trends so that systematic errors can be eliminated and quality control improved. (authors)

  16. SU-C-BRD-01: A Statistical Modeling Method for Quality Control of Intensity- Modulated Radiation Therapy Planning

    International Nuclear Information System (INIS)

    Gao, S; Meyer, R; Shi, L; D'Souza, W; Zhang, H

    2014-01-01

    Purpose: To apply a statistical modeling approach, threshold modeling (TM), for quality control of intensity-modulated radiation therapy (IMRT) treatment plans. Methods: A quantitative measure, which was the weighted sum of violations of dose/dose-volume constraints, was first developed to represent the quality of each IMRT plan. The threshold modeling approach, which is an extension of extreme value theory in statistics and an effective way to model extreme values, was then applied to analyze the quality of the plans summarized by our quantitative measures. Our approach modeled the plans generated by planners as a series of independent and identically distributed random variables and described their behavior when plan quality was controlled below a certain threshold. We tested our approach retrospectively with five locally advanced head and neck cancer patients. Two statistics were incorporated for numerical analysis: the probability of quality improvement (PQI) of the plans and the expected amount of improvement on the quantitative measure (EQI). Results: After clinical planners generated 15 plans for each patient, we applied our approach to obtain the PQI and EQI as if planners were to generate an additional 15 plans. For two of the patients, the PQI was significantly higher than for the other three (0.17 and 0.18 compared to 0.08, 0.01 and 0.01). The actual percentage of the additional 15 plans that outperformed the best of the initial 15 plans was 20% and 27%, compared to 11%, 0% and 0%. The EQI for those two patients was 34.5 and 32.9; for the remaining three patients it was 9.9, 1.4 and 6.6. The actual improvements obtained were 28.3 and 20.5, compared to 6.2, 0 and 0. Conclusion: TM is capable of reliably identifying the potential quality improvement of IMRT plans. It provides clinicians an effective tool to assess the trade-off between extra planning effort and achievable plan quality. This work was supported in part by NIH/NCI grant CA130814
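
    Threshold modeling builds on the peaks-over-threshold idea from extreme value theory. The sketch below illustrates that idea (not the authors' exact procedure): fit a generalized Pareto distribution to exceedances of a hypothetical plan-quality score and derive a probability-of-quality-improvement figure. The data, threshold choice, and score definition are assumptions, and fitting to a handful of exceedances is only workable as a demonstration.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(11)
        # Hypothetical plan-quality scores (higher = better), e.g. the negated
        # weighted sum of constraint violations for 15 plans of one patient.
        q = rng.gumbel(loc=10.0, scale=2.0, size=15)

        u = np.quantile(q, 0.6)            # threshold separating the "good" tail
        exc = q[q > u] - u                 # exceedances over the threshold
        p_u = exc.size / q.size            # empirical P(Q > u)

        # Fit a generalized Pareto distribution to the exceedances (loc fixed at 0).
        shape, loc, scale = genpareto.fit(exc, floc=0)

        best = q.max()
        p_beat = p_u * genpareto.sf(best - u, shape, loc=loc, scale=scale)

        m = 15                             # additional plans that might be generated
        pqi = 1 - (1 - p_beat) ** m        # P(at least one new plan beats the best)
        print(f"P(single new plan beats best) = {p_beat:.3f}")
        print(f"PQI over {m} additional plans = {pqi:.2f}")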

  17. Additive, control, energy, medical and other hot topics

    NARCIS (Netherlands)

    Brouwer, Dannis Michel

    2014-01-01

    The 14th International Conference of the European Society for Precision Engineering and Nanotechnology (euspen) was held in Dubrovnik, Croatia, on 2-6 June 2014. Among the hot topics discussed were additive manufacturing (AM), motion control in precision systems, renewable energy technologies, and…

  18. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  19. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  20. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Science.gov (United States)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort where the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  1. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    Science.gov (United States)

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
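
    To make the common-cause/special-cause distinction concrete, below is a minimal p-chart sketch on hypothetical monthly proportion data of the kind used in neonatal QI. The denominators, event rate, and the single 3-sigma signalling rule are illustrative assumptions (practical QI charts usually add further run rules).

        import numpy as np

        rng = np.random.default_rng(5)
        # Hypothetical monthly QI data: admissions per month and the number of
        # infants with an outcome of interest (e.g. hypothermia on admission).
        n = rng.integers(40, 80, size=24)        # monthly denominators
        events = rng.binomial(n, 0.15)           # simulated common-cause process

        p = events / n
        pbar = events.sum() / n.sum()            # center line

        # p-chart: 3-sigma limits vary with each month's denominator.
        se = np.sqrt(pbar * (1 - pbar) / n)
        ucl = pbar + 3 * se
        lcl = np.clip(pbar - 3 * se, 0, None)

        special_cause = np.where((p > ucl) | (p < lcl))[0]
        print(f"center line = {pbar:.3f}")
        print("months flagged as special cause:", special_cause)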

  2. Tailor-Made Additives for Morphology Control in Molecular Bulk-Heterojunction Photovoltaics

    KAUST Repository

    Graham, Kenneth R.

    2013-01-09

    Tailor-made additives, which are molecules that share the same molecular structure as a parent molecule with only slight structural variations, have previously been demonstrated as a useful means to control crystallization dynamics in solution. For example, tailor-made additives can be added to solutions of a crystallizing parent molecule to alter the crystal growth rate, size, and shape. We apply this strategy as a means to predictably control morphology in molecular bulk-heterojunction (BHJ) photovoltaic cells. Through the use of an asymmetric oligomer substituted with a bulky triisobutylsilyl end group, the morphology of BHJ blends can be controlled resulting in a near doubling (from 1.3 to 2.2%) in power conversion efficiency. The use of tailor-made additives provides promising opportunities for controlling crystallization dynamics, and thereby film morphologies, for many organic electronic devices such as photovoltaics and field-effect transistors. © 2012 American Chemical Society.

  3. Tailor-Made Additives for Morphology Control in Molecular Bulk-Heterojunction Photovoltaics

    KAUST Repository

    Graham, Kenneth R.; Stalder, Romain; Wieruszewski, Patrick M.; Patel, Dinesh G.; Salazar, Danielle H.; Reynolds, John R.

    2013-01-01

    Tailor-made additives, which are molecules that share the same molecular structure as a parent molecule with only slight structural variations, have previously been demonstrated as a useful means to control crystallization dynamics in solution. For example, tailor-made additives can be added to solutions of a crystallizing parent molecule to alter the crystal growth rate, size, and shape. We apply this strategy as a means to predictably control morphology in molecular bulk-heterojunction (BHJ) photovoltaic cells. Through the use of an asymmetric oligomer substituted with a bulky triisobutylsilyl end group, the morphology of BHJ blends can be controlled resulting in a near doubling (from 1.3 to 2.2%) in power conversion efficiency. The use of tailor-made additives provides promising opportunities for controlling crystallization dynamics, and thereby film morphologies, for many organic electronic devices such as photovoltaics and field-effect transistors. © 2012 American Chemical Society.

  4. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  5. PRECISE - pregabalin in addition to usual care: Statistical analysis plan

    NARCIS (Netherlands)

    S. Mathieson (Stephanie); L. Billot (Laurent); C. Maher (Chris); A.J. McLachlan (Andrew J.); J. Latimer (Jane); B.W. Koes (Bart); M.J. Hancock (Mark J.); I. Harris (Ian); R.O. Day (Richard O.); J. Pik (Justin); S. Jan (Stephen); C.-W.C. Lin (Chung-Wei Christine)

    2016-01-01

    Background: Sciatica is a severe, disabling condition that lacks high-quality evidence for effective treatment strategies. This a priori statistical analysis plan describes the methodology of analysis for the PRECISE study. Methods/design: PRECISE is a prospectively registered, double…

  6. Statistical transformation and the interpretation of inpatient glucose control data from the intensive care unit.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-05-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on the interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit during 2011 were obtained. Box-Cox transformation of POC-BG measurements was performed, and the distribution of the data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
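
    A sketch of the transform-then-chart idea the study describes: Box-Cox-transform simulated right-skewed glucose values (scipy estimates the transformation parameter by maximum likelihood) and then run an EWMA chart on the transformed scale. The simulated distribution and the EWMA weight are assumptions, not the study's data or settings.

        import numpy as np
        from scipy.stats import boxcox

        rng = np.random.default_rng(2)
        # Hypothetical POC blood glucose values (mg/dL): right-skewed, as in the study.
        glucose = rng.lognormal(mean=np.log(140), sigma=0.25, size=500)

        gt, lam = boxcox(glucose)          # transformed values and MLE lambda
        print(f"estimated Box-Cox lambda = {lam:.2f}")

        # EWMA chart on the transformed values.
        mu0, sigma0 = gt.mean(), gt.std(ddof=1)
        w = 0.1
        limit = 3 * sigma0 * np.sqrt(w / (2 - w))
        z, alarms = mu0, []
        for i, x in enumerate(gt):
            z = w * x + (1 - w) * z
            if abs(z - mu0) > limit:
                alarms.append(i)
        print("out-of-control samples on the transformed scale:", alarms[:5])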

  7. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means which guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs
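
    The PCCS algorithm itself is specific to DWPF, but a standard building block for monitoring a noisy multivariate process of this kind is the Hotelling T² chart. The sketch below is a generic T² monitor on simulated correlated composition-like data with a Phase II F-distribution limit; it illustrates the multivariate SPC idea, not the PCCS algorithm.

        import numpy as np
        from scipy.stats import f as f_dist

        rng = np.random.default_rng(4)
        # Hypothetical in-control history: p correlated feed properties
        # measured on n batches.
        n, p = 50, 3
        cov = np.array([[1.0, 0.6, 0.3],
                        [0.6, 1.0, 0.4],
                        [0.3, 0.4, 1.0]])
        X = rng.multivariate_normal(np.zeros(p), cov, size=n)

        mu = X.mean(axis=0)
        S_inv = np.linalg.inv(np.cov(X, rowvar=False))

        def t2(x):
            """Hotelling T-squared distance of one observation from the history."""
            d = x - mu
            return d @ S_inv @ d

        # Phase II upper control limit for a future observation.
        alpha = 0.0027                       # roughly the 3-sigma false-alarm rate
        ucl = (p * (n + 1) * (n - 1)) / (n * (n - p)) * f_dist.ppf(1 - alpha, p, n - p)

        x_new = np.array([1.5, 1.8, 0.2])    # a new batch measurement
        print(f"T2 = {t2(x_new):.2f}, UCL = {ucl:.2f}, in control: {t2(x_new) <= ucl}")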

  8. Quality Control of the Print with the Application of Statistical Methods

    Science.gov (United States)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

    The basis for standardizing the offset printing process is the control of print quality indicators. There are various approaches to this problem, among which the most important are statistical methods. Their practical implementation for managing the quality of the printing process is highly relevant and is reflected in this paper. The paper shows how a control chart can be constructed to identify the causes of deviations in the optical density of a triad of inks in offset printing.

  9. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items

  10. Assessing thermal comfort and energy efficiency in buildings by statistical quality control for autocorrelated data

    International Nuclear Information System (INIS)

    Barbeito, Inés; Zaragoza, Sonia; Tarrío-Saavedra, Javier; Naya, Salvador

    2017-01-01

    Highlights: • Intelligent web platform development for energy efficiency management in buildings. • Controlling and supervising thermal comfort and energy consumption in buildings. • Statistical quality control procedure to deal with autocorrelated data. • Open source alternative using R software. - Abstract: In this paper, a case study of performing a reliable statistical procedure to evaluate the quality of HVAC systems in buildings using data retrieved from an ad hoc big data web energy platform is presented. The proposed methodology based on statistical quality control (SQC) is used to analyze the real state of thermal comfort and energy efficiency of the offices of the company FRIDAMA (Spain) in a reliable way. Non-conformities or alarms, and the actual assignable causes of these out-of-control states, are detected. The capability to meet specification requirements is also analyzed. Tools and packages implemented in the open-source R software are employed to apply the different procedures. First, this study proposes to fit ARIMA time series models to CTQ variables. Then, the application of Shewhart and EWMA control charts to the time series residuals is proposed to control and monitor thermal comfort and energy consumption in buildings. Once thermal comfort and consumption variability are estimated, the implementation of capability indexes for autocorrelated variables is proposed to calculate the degree to which standard specifications are met. According to the case study results, the proposed methodology detected real anomalies in the HVAC installation, helping to detect assignable causes and to make appropriate decisions. One of the goals is to perform and describe this statistical procedure step by step so that it can be replicated by practitioners.
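
    A minimal sketch of the ARIMA-residual charting step this abstract proposes, using statsmodels: fit a low-order model to an autocorrelated series, then apply an ordinary Shewhart rule to the (approximately uncorrelated) residuals. The simulated AR(1) temperature series, the model order, and the 3-sigma rule are illustrative assumptions.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(6)
        # Hypothetical indoor-temperature CTQ series with AR(1) autocorrelation,
        # as often seen in HVAC data.
        n = 300
        temp = np.empty(n)
        temp[0] = 22.0
        for t in range(1, n):
            temp[t] = 22.0 + 0.8 * (temp[t - 1] - 22.0) + rng.normal(0, 0.2)

        # Fit an ARIMA model and chart its residuals, which should be uncorrelated.
        resid = ARIMA(temp, order=(1, 0, 0)).fit().resid

        center, sigma = resid.mean(), resid.std(ddof=1)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma
        alarms = np.where((resid > ucl) | (resid < lcl))[0]
        print("residual-chart alarms at samples:", alarms)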

  11. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    Science.gov (United States)

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe statistical process control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the center line suffered inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  12. Method for mapping population-based case-control studies: an application using generalized additive models

    Directory of Open Access Journals (Sweden)

    Aschengrau Ann

    2006-06-01

    Full Text Available Background: Mapping spatial distributions of disease occurrence and risk can serve as a useful tool for identifying exposures of public health concern. Disease registry data are often mapped by town or county of diagnosis and contain limited data on covariates. These maps often possess poor spatial resolution, the potential for spatial confounding, and the inability to consider latency. Population-based case-control studies can provide detailed information on residential history and covariates. Results: Generalized additive models (GAMs) provide a useful framework for mapping point-based epidemiologic data. Smoothing on location while controlling for covariates produces adjusted maps. We generate maps of odds ratios using the entire study area as a reference. We smooth using a locally weighted regression smoother (loess), a method that combines the advantages of nearest neighbor and kernel methods. We choose an optimal degree of smoothing by minimizing Akaike's Information Criterion. We use a deviance-based test to assess the overall importance of location in the model and pointwise permutation tests to locate regions of significantly increased or decreased risk. The method is illustrated with synthetic data and data from a population-based case-control study, using S-Plus and ArcView software. Conclusion: Our goal is to develop practical methods for mapping population-based case-control and cohort studies. The method described here performs well for our synthetic data, reproducing important features of the data and adequately controlling the covariate. When applied to the population-based case-control data set, the method suggests spatial confounding and identifies statistically significant areas of increased and decreased odds ratios.
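
    A deliberately simplified sketch of the mapping idea follows: estimate the local odds at a map point with a loess-style tricube weighting of the nearest neighbors and compare them with the odds over the whole study area. Unlike the paper's GAM, this toy version does not adjust for covariates; the data, the elevated-risk location, and the span are synthetic assumptions.

        import numpy as np

        rng = np.random.default_rng(8)
        # Hypothetical case-control data: residential coordinates and case status,
        # with excess risk near the point (0.7, 0.7).
        n = 600
        xy = rng.uniform(0, 1, size=(n, 2))
        risk = 0.3 + 0.4 * np.exp(-np.sum((xy - 0.7) ** 2, axis=1) / 0.02)
        case = rng.binomial(1, np.clip(risk, 0, 1))

        def local_odds_ratio(grid_pt, xy, case, span=0.25):
            """Loess-style estimate: tricube-weighted case proportion among the
            nearest span*n subjects, expressed as odds relative to the whole area."""
            d = np.linalg.norm(xy - grid_pt, axis=1)
            idx = np.argsort(d)[: int(span * len(xy))]
            w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3     # tricube weights
            p_local = np.average(case[idx], weights=w)
            p_all = case.mean()
            return (p_local / (1 - p_local)) / (p_all / (1 - p_all))

        for pt in (np.array([0.7, 0.7]), np.array([0.2, 0.2])):
            print(pt, f"OR = {local_odds_ratio(pt, xy, case):.2f}")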

  13. Multivariate Statistical Process Control Charts and the Problem of Interpretation: A Short Overview and Some Applications in Industry

    OpenAIRE

    Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios

    2005-01-01

    Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This…

  14. Additive Manufacturing: Multi Material Processing and Part Quality Control

    DEFF Research Database (Denmark)

    Pedersen, David Bue

    This Ph.D. dissertation, "Additive Manufacturing: Multi Material Processing and Part Quality Control", deals with Additive Manufacturing technologies, which is a common name for a series of processes that are recognized by being computer controlled, highly automated, and manufacturing objects by a layered deposition of material. Two areas of particular interest are addressed. They are rooted in two very different areas, yet are intended to fuel the same goal: to help Additive Manufacturing technologies one step closer to becoming the autonomous, digital manufacturing method of tomorrow. Vision systems: A paradox exists in the field of Additive Manufacturing. The technologies allow for close-to unrestrained and integral geometrical freedom. Almost any geometry can be manufactured fast, efficiently and cheaply. Something that has been a missing fundamental capability since the entering of the industrial…

  15. Advances in Statistical Control, Algebraic Systems Theory, and Dynamic Systems Characteristics A Tribute to Michael K Sain

    CERN Document Server

    Won, Chang-Hee; Michel, Anthony N

    2008-01-01

    This volume - dedicated to Michael K. Sain on the occasion of his seventieth birthday - is a collection of chapters covering recent advances in stochastic optimal control theory and algebraic systems theory. Written by experts in their respective fields, the chapters are thematically organized into four parts: Part I focuses on statistical control theory, where the cost function is viewed as a random variable and performance is shaped through cost cumulants. In this respect, statistical control generalizes linear-quadratic-Gaussian and H-infinity control. Part II addresses algebraic systems theory…

  16. Natural time analysis and Tsallis non-additive entropy statistical mechanics.

    Science.gov (United States)

    Sarlis, N. V.; Skordas, E. S.; Varotsos, P.

    2016-12-01

    Upon analyzing the seismic data in natural time and employing a sliding natural time window comprising a number of events that would occur in a few months, it has been recently uncovered [1] that a precursory Seismic Electric Signals activity [2] initiates almost simultaneously with the appearance of a minimum in the fluctuations of the order parameter of seismicity [3]. Such minima have been ascertained [4] during periods of the magnitude time series exhibiting long range correlations [5] a few months before all earthquakes of magnitude 7.6 or larger that occurred in the entire Japanese area from 1 January 1984 to 11 March 2011 (the day of the M9 Tohoku-Oki earthquake). Before and after these minima, characteristic changes of the temporal correlations between earthquake magnitudes are observed which cannot be captured by Tsallis non-additive entropy statistical mechanics, in the frame of which it has been suggested that kappa distributions arise [6]. Here, we extend the study concerning the existence of such minima to a large area that includes the Aegean Sea and its surroundings, which in general exhibits seismo-tectonics [7] different from those of the entire Japanese area. References: P. A. Varotsos et al., Tectonophysics, 589 (2013) 116. P. Varotsos and M. Lazaridou, Tectonophysics 188 (1991) 321. P.A. Varotsos et al., Phys Rev E 72 (2005) 041103. N. V. Sarlis et al., Proc Natl Acad Sci USA 110 (2013) 13734. P. A. Varotsos, N. V. Sarlis, and E. S. Skordas, J Geophys Res Space Physics 119 (2014), 9192, doi: 10.1002/2014JA0205800. G. Livadiotis, and D. J. McComas, J Geophys Res 114 (2009) A11105, doi:10.1029/2009JA014352. S. Uyeda et al., Tectonophysics, 304 (1999) 41.

  17. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that…

  18. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    Science.gov (United States)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics database is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  19. Robust reconfigurable control for parametric and additive faults with FDI uncertainties

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Yang, Zhenyu

    2000-01-01

    From the system recoverability point of view, this paper discusses robust reconfigurable control synthesis for LTI systems and a class of nonlinear control systems with parametric and additive faults, as well as deviations generated by FDI algorithms. By following the model-matching strategy, an augmented optimal control problem is constructed based on the considered faulty and fictitious nominal systems, such that robust control design techniques, such as H-infinity control and mu synthesis, can be employed for the reconfigurable control design.

  20. Statistical process control applied to the manufacturing of beryllia ceramics

    International Nuclear Information System (INIS)

    Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.

    1991-01-01

    To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical

  1. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Science.gov (United States)

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  2. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident

  3. Antagonistic control of a dual-input mammalian gene switch by food additives.

    Science.gov (United States)

    Xie, Mingqi; Ye, Haifeng; Hamri, Ghislaine Charpin-El; Fussenegger, Martin

    2014-08-01

    Synthetic biology has significantly advanced the design of mammalian trigger-inducible transgene-control devices that are able to programme complex cellular behaviour. Fruit-based benzoate derivatives licensed as food additives, such as flavours (e.g. vanillate) and preservatives (e.g. benzoate), are a particularly attractive class of trigger compounds for orthogonal mammalian transgene control devices because of their innocuousness, physiological compatibility and simple oral administration. Capitalizing on the genetic componentry of the soil bacterium Comamonas testosteroni, which has evolved to catabolize a variety of aromatic compounds, we have designed different mammalian gene expression systems that could be induced and repressed by the food additives benzoate and vanillate. When implanting designer cells engineered for gene switch-driven expression of the human placental secreted alkaline phosphatase (SEAP) into mice, blood SEAP levels of treated animals directly correlated with a benzoate-enriched drinking programme. Additionally, the benzoate-/vanillate-responsive device was compatible with other transgene control systems and could be assembled into higher-order control networks providing expression dynamics reminiscent of a lap-timing stopwatch. Designer gene switches using licensed food additives as trigger compounds to achieve antagonistic dual-input expression profiles and provide novel control topologies and regulation dynamics may advance future gene- and cell-based therapies. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Bootstrap-based confidence estimation in PCA and multivariate statistical process control

    DEFF Research Database (Denmark)

    Babamoradi, Hamid

    Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based… The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus… Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in the case of the Q-statistic… be used to detect outliers in the data since the outliers can distort the bootstrap estimates.
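
    A sketch of one way such bootstrap limits can be built for the Q-statistic (squared prediction error) of a PCA model: resample rows, refit the PCA, and recompute the control limit each time. The simulated data, the number of retained components, and the percentile choices are assumptions, not the thesis's exact procedure.

        import numpy as np

        rng = np.random.default_rng(9)
        # Hypothetical in-control training data for an MSPC model:
        # k latent factors observed through p variables plus noise.
        n, p, k = 100, 6, 2
        W = rng.normal(size=(p, k))
        X = rng.normal(size=(n, k)) @ W.T + 0.1 * rng.normal(size=(n, p))

        def q_limit(X, k, q=0.95):
            """95th percentile of the Q-statistic under a k-component PCA model."""
            Xc = X - X.mean(axis=0)
            _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
            P = Vt[:k].T                     # loadings of the retained components
            resid = Xc - Xc @ P @ P.T
            return np.quantile(np.sum(resid ** 2, axis=1), q)

        # Bootstrap: resample rows, refit the PCA, recompute the limit each time.
        boots = np.array([q_limit(X[rng.integers(0, n, n)], k) for _ in range(1000)])
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"Q-limit = {q_limit(X, k):.3f}, bootstrap 95% interval: ({lo:.3f}, {hi:.3f})")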

  5. Statistical process control support during Defense Waste Processing Facility chemical runs

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically-defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and flyash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous or PHA) is blended with the insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository

  6. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry;Maitrise statistique des processus appliquee aux controles avant traitement par dosimetrie portale en radiotherapie conformationnelle avec modulation d'intensite

    Energy Technology Data Exchange (ETDEWEB)

    Villani, N.; Noel, A. [Laboratoire de recherche en radiophysique, CRAN UMR 7039, Nancy universite-CNRS, 54 - Vandoeuvre-les-Nancy (France); Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A. [Departement de radiophysique, centre Alexis-Vautrin, 54 - Vandoeuvre-les-Nancy (France); Francois, P. [Institut Curie, 75 - Paris (France)

    2010-06-15

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to improving the security of intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to put portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Results: Control charts showing both slow, weak drifts and strong, fast drifts of the mean and standard deviation were established, revealing an introduced special cause (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to…

  7. Semi-Poisson statistics in quantum chaos.

    Science.gov (United States)

    García-García, Antonio M; Wang, Jiao

    2006-03-01

    We investigate the quantum properties of a nonrandom Hamiltonian with a steplike singularity. It is shown that the eigenfunctions are multifractals and, in a certain range of parameters, the level statistics is described exactly by semi-Poisson statistics (SP) typical of pseudointegrable systems. It is also shown that our results are universal, namely, they depend exclusively on the presence of the steplike singularity and are not modified by smooth perturbations of the potential or the addition of a magnetic flux. Although the quantum properties of our system are similar to those of a disordered conductor at the Anderson transition, we report important quantitative differences in both the level statistics and the multifractal dimensions controlling the transition. Finally, the study of quantum transport properties suggests that the classical singularity induces quantum anomalous diffusion. We discuss how these findings may be experimentally corroborated by using ultracold atoms techniques.
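
    For reference, the standard nearest-neighbour spacing distributions usually quoted in this context (for unit mean spacing) are given below in LaTeX notation; the semi-Poisson form is the one the abstract refers to, and these expressions come from the general quantum-chaos literature rather than from this paper.

        P_{\mathrm{Poisson}}(s) = e^{-s}, \qquad
        P_{\mathrm{SP}}(s) = 4\,s\,e^{-2s}, \qquad
        P_{\mathrm{Wigner}}(s) = \frac{\pi s}{2}\,e^{-\pi s^{2}/4}.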

  8. Additivity of statistical moments in the exponentially modified Gaussian model of chromatography

    International Nuclear Information System (INIS)

    Howerton, Samuel B.; Lee Chomin; McGuffin, Victoria L.

    2002-01-01

    A homologous series of saturated fatty acids ranging from C₁₀ to C₂₂ was separated by reversed-phase capillary liquid chromatography. The resultant zone profiles were found to be fit best by an exponentially modified Gaussian (EMG) function. To compare the EMG function and statistical moments for the analysis of the experimental zone profiles, a series of simulated profiles was generated by using fixed values for retention time and different values for the symmetrical (σ) and asymmetrical (τ) contributions to the variance. The simulated profiles were modified with respect to the integration limits, the number of points, and the signal-to-noise ratio. After modification, each profile was analyzed by using statistical moments and an iteratively fit EMG equation. These data indicate that the statistical moment method is much more susceptible to error when the degree of asymmetry is large, when the integration limits are inappropriately chosen, when the number of points is small, and when the signal-to-noise ratio is small. The experimental zone profiles were then analyzed by using the statistical moment and EMG methods. Although care was taken to minimize the sources of error discussed above, significant differences were found between the two methods. The differences in the second moment suggest that the symmetrical and asymmetrical contributions to broadening in the experimental zone profiles are not independent. As a consequence, the second moment is not equal to the sum of σ² and τ², as is commonly assumed. This observation has important implications for the elucidation of thermodynamic and kinetic information from chromatographic zone profiles
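
    The moment additivity in question can be checked numerically with scipy's EMG implementation (exponnorm, with shape K = τ/σ): under the independence assumption the mean is μ + τ and the variance is σ² + τ², which is exactly the relation the experimental profiles were found to violate. The parameter values below are arbitrary.

        import numpy as np
        from scipy.stats import exponnorm

        # EMG = Gaussian N(mu, sigma^2) convolved with an exponential of mean tau.
        # scipy parameterizes this as exponnorm with shape K = tau / sigma.
        mu, sigma, tau = 10.0, 0.30, 0.45
        emg = exponnorm(tau / sigma, loc=mu, scale=sigma)

        # Under independence the moments are additive:
        print("mean:", emg.mean(), " vs  mu + tau =", mu + tau)
        print("var :", emg.var(), " vs  sigma^2 + tau^2 =", sigma**2 + tau**2)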

  9. Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control

    DEFF Research Database (Denmark)

    Vanhatalo, Erik; Kulahci, Murat

    2015-01-01

    A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled…

  10. Statistical process control for alpha spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.

  11. Statistical process control for alpha spectroscopy

    International Nuclear Information System (INIS)

    Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.

    1995-01-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs

  12. Review of the patient positioning reproducibility in head-and-neck radiotherapy using Statistical Process Control.

    Science.gov (United States)

    Moore, Sarah J; Herst, Patries M; Louwe, Robert J W

    2018-05-01

    A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    Science.gov (United States)

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  14. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  15. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
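
    The performance-index step can be illustrated with the standard Cp/Cpk indices computed against the ±4% clinical tolerance quoted above; this is a hedged sketch with invented deviation values, not the authors' C(pc) correction for measurement variability.

        import numpy as np

        def capability_indices(deviations, lsl=-4.0, usl=4.0):
            # Cp/Cpk for measured-vs-calculated dose deviations (%) against tolerances
            d = np.asarray(deviations, dtype=float)
            mu, sigma = d.mean(), d.std(ddof=1)
            cp = (usl - lsl) / (6 * sigma)                # potential capability
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # penalises off-centre processes
            return cp, cpk

        # Hypothetical per-patient QC deviations (%)
        devs = [0.5, -1.2, 0.8, 1.5, -0.3, 0.9, -1.8, 0.2, 1.1, -0.6]
        cp, cpk = capability_indices(devs)
        print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")  # values above 1 suggest tolerances are met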

  16. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Science.gov (United States)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
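
    How a program can flag "abnormal points or patterns" is easy to sketch; the two rules below are a classic Western Electric subset, chosen for illustration and not claimed to be the rules used in the AISC prototype.

        import numpy as np

        def pattern_flags(x, centre, sigma):
            # Rule 1: a single point beyond 3 sigma.
            # Rule 4: eight consecutive points on the same side of the centre line.
            x = np.asarray(x, dtype=float)
            flags = [(i, "beyond 3 sigma") for i, v in enumerate(x)
                     if abs(v - centre) > 3 * sigma]
            side, run = np.sign(x - centre), 1
            for i in range(1, len(x)):
                run = run + 1 if side[i] == side[i - 1] != 0 else 1
                if run == 8:
                    flags.append((i, "8-point run on one side"))
            return flags

        data = [0.2, -0.1, 0.4, 0.3, 0.5, 0.1, 0.2, 0.6, 0.3, 3.4]
        print(pattern_flags(data, centre=0.0, sigma=1.0))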

  17. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries; the quarterly prevalence never crossed the control limits, that is, was never out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. Additives for pH control in PWR secondary water

    International Nuclear Information System (INIS)

    Cobble, J.W.; Turner, P.J.

    1985-08-01

    The purpose of this project is the development of methods for identifying the most promising materials for use as pH control agents in steam generator and auxiliary hot water systems. The methods developed in the first project year to select new pH control agents have been explored in more detail in an effort to assess and improve their reliability. In addition, some new classes of compounds which were not treatable before have been examined on the basis of new experimental results. Predictions of pH and volatility to 300 °C have now been made for a total of 79 organic bases. Some of the newer classes of compounds studied present properties which may make them of interest either as secondary additives or as principal agents in the event of a change in the approach to water treatment. While there have been no substantial changes in the number of compounds which meet all EPRI criteria, there are other species which may meet the EPRI criteria after further research. 19 refs., 31 tabs

  19. Effect of moulding sand on statistically controlled hybrid rapid casting solution for zinc alloys

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Rupinder [Guru Nanak Dev Engineering College, Ludhiana (India)

    2010-08-15

    The purpose of the present investigation is to study the effect of moulding sand on decreasing the shell wall thickness of mould cavities for economical and statistically controlled hybrid rapid casting solutions (a combination of three-dimensional printing and conventional sand casting) for zinc alloys. Starting from the identification of a component/benchmark, technological prototypes were produced at different shell wall thicknesses supported by three different types of sand (namely: dry, green and molasses). Prototypes prepared by the proposed process are for assembly-check purposes and not for functional validation of the parts. The study suggested that a shell wall thinner than the recommended thickness (12 mm) is more suitable for dimensional accuracy. The best dimensional accuracy was obtained at 3 mm shell wall thickness with green sand. The process was found to be under statistical control

  20. Between and beyond additivity and non-additivity : the statistical modelling of genotype by environment interaction in plant breeding

    NARCIS (Netherlands)

    Eeuwijk, van F.A.

    1996-01-01

    In plant breeding it is a common observation to see genotypes react differently to environmental changes. This phenomenon is called genotype by environment interaction. Many statistical approaches for analysing genotype by environment interaction rely heavily on the analysis of variance model.

  1. Statistically Controlling for Confounding Constructs Is Harder than You Think.

    Directory of Open Access Journals (Sweden)

    Jacob Westfall

    Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest--in some cases approaching 100%--when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity.
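
    The core simulation argument is compact enough to reproduce. In this hedged sketch the outcome depends only on a latent confound; two noisy measures of that confound enter the regression as the "control" and the "new" predictor, and the proportion of runs in which the new predictor is (spuriously) significant is tallied. Sample size, reliability and run count are arbitrary choices, not the paper's settings.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        def one_run(n=500, reliability=0.7):
            # Noise sd giving var(latent)/var(measure) = reliability
            noise = np.sqrt(1 / reliability - 1)
            latent = rng.standard_normal(n)
            x1 = latent + noise * rng.standard_normal(n)  # unreliable "control"
            x2 = latent + noise * rng.standard_normal(n)  # candidate predictor
            y = latent + rng.standard_normal(n)           # y depends on latent only
            X = np.column_stack([np.ones(n), x1, x2])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            cov = (resid @ resid / (n - 3)) * np.linalg.inv(X.T @ X)
            t_stat = beta[2] / np.sqrt(cov[2, 2])
            return 2 * stats.t.sf(abs(t_stat), df=n - 3) < 0.05

        rate = np.mean([one_run() for _ in range(1000)])
        print(f"False-positive rate for the 'incremental' predictor: {rate:.2f}")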

  2. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    Science.gov (United States)

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  3. A rank-based algorithm of differential expression analysis for small cell line data with statistical control.

    Science.gov (United States)

    Li, Xiangyu; Cai, Hao; Wang, Xianlong; Ao, Lu; Guo, You; He, Jun; Gu, Yunyan; Qi, Lishuang; Guan, Qingzhou; Lin, Xu; Guo, Zheng

    2017-10-13

    To detect differentially expressed genes (DEGs) in small-scale cell line experiments, usually with only two or three technical replicates for each state, the commonly used statistical methods such as significance analysis of microarrays (SAM), limma and RankProd (RP) lack statistical power, while the fold change method lacks any statistical control. In this study, we demonstrated that the within-sample relative expression orderings (REOs) of gene pairs were highly stable among technical replicates of a cell line but often widely disrupted after certain treatments such as gene knockdown, gene transfection and drug treatment. Based on this finding, we customized the RankComp algorithm, previously designed for individualized differential expression analysis through REO comparison, to identify DEGs with certain statistical control for small-scale cell line data. In both simulated and real data, the new algorithm, named CellComp, exhibited high precision with much higher sensitivity than the original RankComp, SAM, limma and RP methods. Therefore, CellComp provides an efficient tool for analyzing small-scale cell line data. © The Author 2017. Published by Oxford University Press.
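
    The within-sample relative expression ordering (REO) idea reduces to comparing signs of gene-pair differences. A toy sketch of the stability score between two replicates follows, with invented expression values; it illustrates the REO concept only, not the CellComp algorithm itself.

        import numpy as np
        from itertools import combinations

        def reo_concordance(a, b):
            # Fraction of gene pairs (i, j) ordered the same way in both samples
            a, b = np.asarray(a, float), np.asarray(b, float)
            pairs = list(combinations(range(len(a)), 2))
            same = sum(1 for i, j in pairs
                       if np.sign(a[i] - a[j]) == np.sign(b[i] - b[j]) != 0)
            return same / len(pairs)

        rep1 = [5.1, 2.3, 8.7, 1.2, 6.6]   # hypothetical replicate 1
        rep2 = [4.8, 2.6, 9.1, 1.0, 6.9]   # orderings mostly preserved
        print(f"REO concordance: {reo_concordance(rep1, rep2):.2f}")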

  4. Mainstreaming Remedial Mathematics Students in Introductory Statistics: Results Using a Randomized Controlled Trial

    Science.gov (United States)

    Logue, Alexandra W.; Watanabe-Rose, Mari

    2014-01-01

    This study used a randomized controlled trial to determine whether students, assessed by their community colleges as needing an elementary algebra (remedial) mathematics course, could instead succeed at least as well in a college-level, credit-bearing introductory statistics course with extra support (a weekly workshop). Researchers randomly…

  5. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
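
    The four-to-one requirement mentioned above is a simple ratio check; a tiny illustration with invented numbers:

        # Test uncertainty ratio (TUR) check; values are hypothetical
        instrument_tolerance = 2.0     # +/- units allowed by the instrument spec
        standard_uncertainty = 0.6     # uncertainty of the calibrating standard
        tur = instrument_tolerance / standard_uncertainty
        print(f"TUR = {tur:.1f}:1 ->",
              "OK" if tur >= 4 else "compute the exact calibration uncertainty")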

  6. The chaos and control of a food chain model supplying additional food to top-predator

    International Nuclear Information System (INIS)

    Sahoo, Banshidhar; Poria, Swarup

    2014-01-01

    Highlights: • We propose a chaotic food chain model supplying additional food to the top predator. • Local and global stability conditions are derived in the presence of additional food. • Chaos is controlled only by increasing the quantity of additional food. • The system enters a periodic regime and exhibits Hopf bifurcations when additional food is supplied. • This is an application of non-chemical methods for controlling chaos. -- Abstract: The control and management of chaotic populations is one of the main objectives of constructing mathematical models in ecology today. In this paper, we apply a technique for controlling chaotic predator–prey population dynamics by supplying additional food to the top predator. We formulate a three-species predator–prey model supplying additional food to the top predator. Existence conditions and local stability criteria of equilibrium points are determined analytically. Persistence conditions for the system are derived. Global stability conditions for the interior equilibrium point are calculated. Theoretical results are verified through numerical simulations. A phase diagram is presented for various qualities and quantities of additional food. One-parameter bifurcation analysis is done with respect to the quality and quantity of additional food separately, keeping one of them fixed. Using the MATCONT package, we derive the bifurcation scenarios when both parameters, quality and quantity of additional food, vary together. We predict the existence of Hopf points (H), limit points (LP) and branch points (BP) in the model for suitable supply of additional food. We have computed the regions of different dynamical behaviour in the quantity–quality parametric plane. From our study we conclude that the chaotic population dynamics of a predator–prey system can be controlled to obtain regular population dynamics only by supplying additional food to the top predator. This study is aimed at introducing a new non-chemical chaos control mechanism in a predator–prey system with the

  7. Compliance and control characteristics of an additive manufactured-flexure stage

    Energy Technology Data Exchange (ETDEWEB)

    Lee, ChaBum; Tarbutton, Joshua A. [Department of Mechanical Engineering, University of South Carolina, 300 Main St., Columbia, South Carolina 29208 (United States)

    2015-04-15

    This paper presents the compliance and positioning-control characteristics of an additively manufactured nanopositioning system consisting of a flexure mechanism and a voice coil motor (VCM). The double-compound-notch flexure stage was designed to utilize the elastic deformation of two symmetrical four-bar mechanisms to provide a millimeter-level working range. An additive manufacturing (AM) process, stereolithography, was used to fabricate the flexure stage. The AM stage was inspected for air voids and shape irregularity using a 3D X-ray computed tomography scanner. The compliance, open-loop resonance peak, and damping ratio of the AM stage were measured as 0.317 mm/N, 80 Hz, and 0.19, respectively. The AM stage was position-controlled with proportional-integral-derivative feedback, and a capacitive sensor was used to measure the displacement. As a result, the AM flexure mechanism was successfully controlled to 25 nm positioning within a 500 μm range. The resonance peak was found at approximately 280 Hz in closed loop. This research showed that the AM flexure mechanism and the VCM can provide millimeter range with high precision and can be a good alternative to an expensive metal-based flexure mechanism and piezoelectric transducer.

  8. Compliance and control characteristics of an additive manufactured-flexure stage

    International Nuclear Information System (INIS)

    Lee, ChaBum; Tarbutton, Joshua A.

    2015-01-01

    This paper presents the compliance and positioning-control characteristics of an additively manufactured nanopositioning system consisting of a flexure mechanism and a voice coil motor (VCM). The double-compound-notch flexure stage was designed to utilize the elastic deformation of two symmetrical four-bar mechanisms to provide a millimeter-level working range. An additive manufacturing (AM) process, stereolithography, was used to fabricate the flexure stage. The AM stage was inspected for air voids and shape irregularity using a 3D X-ray computed tomography scanner. The compliance, open-loop resonance peak, and damping ratio of the AM stage were measured as 0.317 mm/N, 80 Hz, and 0.19, respectively. The AM stage was position-controlled with proportional-integral-derivative feedback, and a capacitive sensor was used to measure the displacement. As a result, the AM flexure mechanism was successfully controlled to 25 nm positioning within a 500 μm range. The resonance peak was found at approximately 280 Hz in closed loop. This research showed that the AM flexure mechanism and the VCM can provide millimeter range with high precision and can be a good alternative to an expensive metal-based flexure mechanism and piezoelectric transducer
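
    For readers who want to experiment, here is a discrete-time sketch of PID position control of a flexure stage. The plant is a mass-spring-damper built from the reported 0.317 mm/N compliance, 80 Hz open-loop resonance and 0.19 damping ratio; the loop rate and PID gains are invented, not taken from the paper.

        import numpy as np

        # Plant parameters derived from the reported stage characteristics
        k = 1 / 0.317                   # stiffness, N/mm
        wn = 2 * np.pi * 80             # open-loop natural frequency, rad/s
        m = k / wn**2                   # effective mass, N/(mm/s^2)
        c = 2 * 0.19 * np.sqrt(k * m)   # damping, N/(mm/s)

        kp, ki, kd = 40.0, 800.0, 0.01  # illustrative PID gains
        dt, target = 1e-4, 0.5          # 10 kHz loop, 0.5 mm step command
        x = v = integ = prev_err = 0.0
        for _ in range(20000):          # simulate 2 s
            err = target - x
            integ += err * dt
            force = kp * err + ki * integ + kd * (err - prev_err) / dt
            prev_err = err
            a = (force - c * v - k * x) / m   # Newton's second law
            v += a * dt                       # semi-implicit Euler step
            x += v * dt
        print(f"position after 2 s: {x:.4f} mm")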

  9. Sensing and controlling resin-layer thickness in additive manufacturing processes

    NARCIS (Netherlands)

    Kozhevnikov, A.

    2017-01-01

    This AM-TKI project in collaboration with TNO focusses on the sensing and control of resin-layer thickness in AM applications. Industrial Additive Manufacturing is considered to be a potential breakthrough production technology for many applications. A specific AM implementation is VAT photo

  10. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Science.gov (United States)

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  11. Pengendalian Kualitas Produk Di Industri Garment Dengan Menggunakan Statistical Procces Control (SPC)

    Directory of Open Access Journals (Sweden)

    Rizal Rachman

    2017-09-01

    The company regards quality as a key factor for success, together with the quality standards set by the buyer. The purpose of this study was to determine the level of product defects within quality control limits in the garment production process at PT. Asia Penta Garment. The study used the statistical process control (SPC) method, drawing on secondary data from reports of production volume and defects in finished garments in the finishing section in January 2017. The results showed defects beyond the control limits, i.e. points out of control relative to the upper control limit (UCL) and lower control limit (LCL), and average defect levels outside the control limits. To improve product quality, especially of the garments the company produces, the established quality policy must be implemented properly, including negotiating raw materials with buyers according to standards, recruiting experienced workers, enforcing high work discipline, coaching employees, awarding bonuses to employees who meet targets and show high discipline, continuously repairing machines, and maintaining a clean, comfortable and safe working environment. Keywords: quality control, product quality, SPC.

  12. Pengendalian Kualitas Produk Di Industri Garment Dengan Menggunakan Statistical Procces Control (SPC)

    OpenAIRE

    Rizal Rachman

    2017-01-01

    The company regards quality as a key factor for success, together with the quality standards set by the buyer. The purpose of this study was to determine the level of product defects within quality control limits in the garment production process at PT. Asia Penta Garment. The study used the statistical process control method, drawing on secondary data from reports of production volume and defects in finished garments...

  13. 2017 Annual Disability Statistics Supplement

    Science.gov (United States)

    Lauer, E. A; Houtenville, A. J.

    2018-01-01

    The "Annual Disability Statistics Supplement" is a companion report to the "Annual Disability Statistics Compendium." The "Supplement" presents statistics on the same topics as the "Compendium," with additional categorizations by demographic characteristics including age, gender and race/ethnicity. In…

  14. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  15. STATISTIC MODEL OF DYNAMIC DELAY AND DROPOUT ON CELLULAR DATA NETWORKED CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    MUHAMMAD A. MURTI

    2017-07-01

    Delay and dropout are important parameters that influence overall control performance in a Networked Control System (NCS). The goal of this research is to find a model of the delay and dropout of the data communication link in the NCS. Experiments were performed on water level control of a boiler tank, as part of an NCS based on an internet communication network using High Speed Packet Access (HSPA) cellular technology. From these experiments, the closed-loop system response as well as the delay and dropout of data packets were obtained. This research contributes to the modelling of the NCS as a combination of the controlled plant and the data communication link. Another contribution is a statistical model of delay and dropout in the NCS.

  16. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.)

    International Nuclear Information System (INIS)

    Gerard, K.

    2008-11-01

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered a technique of choice to increase local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of treatment preparation, two axes of work have been defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm in the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with cancer of the oropharynx showed that the quality of the plan is also increased compared to a manual beam selection. Indeed, for an equivalent or even a better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results performed using an ionisation chamber at Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment security using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)
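
    As a loose, self-contained illustration of simplex-style optimisation in treatment planning (not the beam-orientation algorithm of the thesis), the toy linear program below chooses non-negative beam weights that keep two target points above a prescribed dose while minimising the dose to an organ at risk; all matrices and limits are invented.

        import numpy as np
        from scipy.optimize import linprog

        # Invented dose per unit beam weight: rows = target points, cols = beams
        target_dose = np.array([[1.0, 0.4, 0.6],
                                [0.5, 1.0, 0.7]])
        oar_dose = np.array([0.3, 0.8, 0.2])   # dose to an organ at risk per beam

        # Minimise OAR dose subject to each target point receiving >= 60 (a.u.)
        res = linprog(c=oar_dose,
                      A_ub=-target_dose, b_ub=-60 * np.ones(2),
                      bounds=[(0, None)] * 3, method="highs")
        print("beam weights:", np.round(res.x, 2), "| OAR dose:", round(res.fun, 2))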

  17. Fiducial registration error as a statistical process control metric in image-guided radiotherapy with prostatic markers

    International Nuclear Information System (INIS)

    Ung, M.N.; Wee, Leonard

    2010-01-01

    Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body (PBRB) image registration during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). Statistical process control charts for individual patients can be designed to identify potentially significant deviation of FRE from expected behaviour. In this study, the aim was to retrospectively apply statistical process control methods to FREs in 34 individuals to identify parameters that may impact on process stability in image-based localization. A robust procedure for estimating control parameters, control limits and fixed tolerance levels from a small number of initial observations has been proposed and discussed. Four distinct types of qualitative control chart behaviour have been observed. Probable clinical factors leading to IGRT process instability are discussed in light of the control chart behaviour. Control charts have been shown to be a useful decision-making tool for detecting potentially out-of-control processes on an individual basis. They can sensitively identify potential problems that warrant more detailed investigation in the IGRT of prostate cancer.
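
    The fiducial registration error itself is a one-line statistic: the root-mean-square residual distance between matched markers after the point-based rigid-body fit. A minimal sketch with invented marker positions:

        import numpy as np

        def fre(registered, reference):
            # RMS residual distance between matched fiducial positions (mm)
            d = np.asarray(registered, float) - np.asarray(reference, float)
            return np.sqrt((d ** 2).sum(axis=1).mean())

        # Hypothetical 3-marker positions after registration vs planning CT (mm)
        moved = [[10.2, 5.1, 3.0], [20.0, 4.9, 7.2], [15.1, 9.8, 5.0]]
        planned = [[10.0, 5.0, 3.1], [20.2, 5.0, 7.0], [15.0, 10.0, 5.1]]
        print(f"FRE = {fre(moved, planned):.2f} mm")  # chart this value per fraction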

  18. [Statistical approach to evaluate the occurrence of out-of acceptable ranges and accuracy for antimicrobial susceptibility tests in inter-laboratory quality control program].

    Science.gov (United States)

    Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa

    2013-03-01

    To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and inhibitory zone diameter for disk diffusion tests on the basis of Clinical and Laboratory Standards Institute (CLSI) M100-S21. In the analysis, more than two out-of-acceptable-range results in the 20 tests were considered as not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more occurrences of out-of-acceptable-range results. Then, a binomial test was applied to each participating laboratory. The results indicated that the occurrences of out-of-acceptable-range results in 11 laboratories were significantly higher than the allowable rate recommended by the CLSI. In addition, the mean deviation of results in each laboratory was statistically compared with zero using a Student's t-test. The results revealed that 5 of the 11 above laboratories reported erroneous test results that systematically drifted to the side of resistance. In conclusion, our statistical approach has enabled us to detect significantly higher occurrences and sources of interpretive errors in antimicrobial susceptibility tests; therefore, this approach can provide us with additional information that can improve the accuracy of test results in clinical microbiology laboratories.
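
    The per-laboratory binomial test takes a few lines with scipy; the 5% allowable rate below is an assumption for illustration, since the published figure was lost from the garbled abstract.

        from scipy.stats import binomtest

        # 20 QC tests per laboratory; allowable out-of-range rate assumed to be 5%
        n_tests, observed_out_of_range, allowable = 20, 4, 0.05
        result = binomtest(observed_out_of_range, n=n_tests, p=allowable,
                           alternative="greater")
        print(f"p = {result.pvalue:.4f}")  # small p: misses exceed the allowable rate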

  19. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and reliability of fabrication processes, which are the original sources of a good and reliable quality of the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of a fabrication process, assuming it is stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe through the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance oriented. However, performance orientation is only successful through an open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the West, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase an open attitude of the vendors toward the using utility. The responsibility of the utilities increased considerably to ensure an adequate quality of the fuel they received. As a matter of fact, utilities sometimes had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the West; however, it now also reaches the utilities in the eastern countries. (author)

  20. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precisely known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability
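
    A minimal sketch of the simplest such model, assuming paired certified and measured values for control standards: a constant bias (the mean difference) and a precision (the standard deviation of the differences). Real models at the ICPP may be more elaborate, e.g. concentration-dependent.

        import numpy as np

        def bias_precision(known, measured):
            # Constant-bias model: bias = mean(measured - known),
            # precision = sample standard deviation of the same differences.
            diff = np.asarray(measured, float) - np.asarray(known, float)
            return diff.mean(), diff.std(ddof=1)

        known = [10.0, 25.0, 50.0, 10.0, 25.0, 50.0]       # certified levels
        measured = [10.4, 25.6, 50.9, 10.2, 25.3, 51.1]    # lab results
        bias, precision = bias_precision(known, measured)
        print(f"bias = {bias:+.2f}, precision (s) = {precision:.2f}")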

  1. SU-D-BRD-07: Evaluation of the Effectiveness of Statistical Process Control Methods to Detect Systematic Errors For Routine Electron Energy Verification

    International Nuclear Information System (INIS)

    Parker, S

    2015-01-01

    Purpose: To evaluate the ability of statistical process control methods to detect systematic errors when using a two dimensional (2D) detector array for routine electron beam energy verification. Methods: Electron beam energy constancy was measured using an aluminum wedge and a 2D diode array on four linear accelerators. Process control limits were established. Measurements were recorded in control charts and compared with both calculated process control limits and TG-142 recommended specification limits. The data were tested for normality, process capability and process acceptability. Additional measurements were recorded while systematic errors were intentionally introduced. Systematic errors included shifts in the alignment of the wedge, incorrect orientation of the wedge, and incorrect array calibration. Results: Control limits calculated for each beam were smaller than the recommended specification limits. Process capability and process acceptability ratios were greater than one in all cases. All data were normally distributed. Shifts in the alignment of the wedge were most apparent for low energies. The smallest shift (0.5 mm) was detectable using process control limits in some cases, while the largest shift (2 mm) was detectable using specification limits in only one case. The wedge orientation tested did not affect the measurements as this did not affect the thickness of aluminum over the detectors of interest. Array calibration dependence varied with energy and selected array calibration. 6 MeV was the least sensitive to array calibration selection while 16 MeV was the most sensitive. Conclusion: Statistical process control methods demonstrated that the data were normally distributed, the process was capable of meeting specifications, and that the process was centered within the specification limits. Though not all systematic errors were distinguishable from random errors, process control limits increased the ability to detect systematic errors

  2. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Science.gov (United States)

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  3. ANALISIS KEHILANGAN MINYAK PADA CRUDE PALM OIL (CPO) DENGAN MENGGUNAKAN METODE STATISTICAL PROCESS CONTROL

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2014-06-01

    PKS “XYZ” is a company engaged in palm oil processing. Its products are Crude Palm Oil (CPO) and Palm Kernel Oil (PKO). The purpose of this study was to analyse oil losses and their contributing factors using the Statistical Process Control method. Statistical Process Control is a set of strategies, techniques and actions taken by an organisation to ensure that it delivers quality products or quality services. The CPO oil-loss samples studied were empty fruit bunches, nuts, fibre and final sludge. Based on the I-MR control charts, all four types of CPO oil losses were within control limits and consistent. The Cpk value of total oil losses, however, lay outside the control limits of the process average; this means that the CPO produced has met customer requirements, with total oil losses below the maximum limit set by the company of 1.65%.

  4. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    Science.gov (United States)

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, an out-of-control situation coincided with the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
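
    The exponentially weighted moving average chart used here follows the textbook formulation z_i = lambda*x_i + (1 - lambda)*z_(i-1) with time-varying limits; a hedged sketch with invented daily output values (percent of baseline):

        import numpy as np

        def ewma_chart(x, lam=0.2, L=3.0):
            # EWMA statistic plus control limits that widen toward the asymptote
            x = np.asarray(x, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)
            z = np.empty_like(x)
            z[0] = lam * x[0] + (1 - lam) * mu
            for i in range(1, len(x)):
                z[i] = lam * x[i] + (1 - lam) * z[i - 1]
            t = np.arange(1, len(x) + 1)
            hw = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
            return z, mu - hw, mu + hw

        outputs = [100.1, 99.8, 100.3, 100.0, 99.7, 100.9, 101.2, 101.0]
        z, lcl, ucl = ewma_chart(outputs)
        print(np.round(z, 2), "signal:", bool(np.any((z < lcl) | (z > ucl))))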

  5. Mathematics and Statistics Research Department progress report for period ending June 30, 1976

    International Nuclear Information System (INIS)

    Gosslee, D.G.; Shelton, B.K.; Ward, R.C.; Wilson, D.G.

    1976-10-01

    Brief summaries of work done in mathematics and related fields are presented. Research in mathematics and statistics concerned statistical estimation, statistical testing, experiment design, probability, continuum mechanics, functional integration, matrices and other operators, and mathematical software. More applied studies were conducted in the areas of analytical chemistry, biological research, chemistry and physics research, energy research, environmental research, health physics research, materials research, reactor and thermonuclear research, sampling inspection, quality control, and life testing, and uranium resource evaluation research. Additional sections deal with educational activities, presentation of research results, and professional activities. 7 figures, 9 tables

  6. Nonclinical statistics for pharmaceutical and biotechnology industries

    CERN Document Server

    2016-01-01

    This book serves as a reference text for regulatory, industry and academic statisticians and also a handy manual for entry-level statisticians. Additionally it aims to stimulate academic interest in the field of Nonclinical Statistics and promote this as an important discipline in its own right. This text brings together for the first time in a single volume a comprehensive survey of methods important to the nonclinical science areas within the pharmaceutical and biotechnology industries: specifically the Discovery and Translational sciences, the Safety/Toxicology sciences, and the Chemistry, Manufacturing and Controls sciences. Drug discovery and development is a long and costly process. Most decisions in the drug development process are made with incomplete information. The data are rife with uncertainties and hence risky by nature. This is therefore the purview of Statistics. As such, this book aims to introduce readers to important statistical thinking and its application in these nonclinical areas. The cha...

  7. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    Science.gov (United States)

    Gourrion, J.; Szekely, T.

    2016-02-01

    Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to the historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values of an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions, such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit the next version of the Copernicus CMEMS dataset.
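
    The min/max-based validity test itself is straightforward once a local historical reference has been aggregated; a hedged sketch follows (the real CORA implementation works on regional/depth cells and is more involved):

        import numpy as np

        def minmax_flags(values, hist_min, hist_max, margin=0.0):
            # Flag observations outside the historically observed extremes;
            # margin widens the interval to trade false alarms vs missed detections.
            v = np.asarray(values, dtype=float)
            return (v < hist_min - margin) | (v > hist_max + margin)

        # Hypothetical salinity samples vs local historical extremes (PSU)
        salinity = [35.1, 35.3, 38.9, 35.0]
        print(minmax_flags(salinity, hist_min=34.5, hist_max=36.2, margin=0.1))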

  8. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  9. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    Science.gov (United States)

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  10. Guideline implementation in clinical practice: use of statistical process control charts as visual feedback devices.

    Science.gov (United States)

    Al-Hussein, Fahad A

    2009-01-01

    To use statistical control charts in a series of audits to improve the acceptance and consistent use of guidelines, and reduce the variations in prescription processing in primary health care. A series of audits were done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions given in the two previous weeks. Thirty-six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by the failure to record the patient's weight (16.4%), the pharmacist's name (14.3%), the duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months, this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
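
    The p-chart for these audits follows directly from the 25% specification and samples of 30 prescriptions; the audit count below is invented for illustration.

        import numpy as np

        def p_chart_limits(p_bar=0.25, n=30):
            # 3-sigma limits for the proportion of non-conforming prescriptions
            se = np.sqrt(p_bar * (1 - p_bar) / n)
            return max(0.0, p_bar - 3 * se), p_bar + 3 * se

        lcl, ucl = p_chart_limits()
        p_hat = 16 / 30   # hypothetical fortnight: 16 non-conforming of 30 sampled
        print(f"LCL={lcl:.3f}, UCL={ucl:.3f}, p={p_hat:.3f},",
              "signal" if not lcl <= p_hat <= ucl else "in control")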

  11. Errors in patient specimen collection: application of statistical process control.

    Science.gov (United States)

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.

  12. Primarily Statistics: Developing an Introductory Statistics Course for Pre-Service Elementary Teachers

    Science.gov (United States)

    Green, Jennifer L.; Blankenship, Erin E.

    2013-01-01

    We developed an introductory statistics course for pre-service elementary teachers. In this paper, we describe the goals and structure of the course, as well as the assessments we implemented. Additionally, we use example course work to demonstrate pre-service teachers' progress both in learning statistics and as novice teachers. Overall, the…

  13. Control of abusive water addition to Octopus vulgaris with non-destructive methods.

    Science.gov (United States)

    Mendes, Rogério; Schimmer, Ove; Vieira, Helena; Pereira, João; Teixeira, Bárbara

    2018-01-01

    Abusive water addition to octopus has highlighted the need for quick non-destructive methods for product qualification in industry and for control of fresh commercial products in markets. Electric conductivity (EC)/pH and dielectric property measurements were selected to detect water uptake in octopus. A significant EC decrease was determined after soaking octopus in freshwater for 4 h. EC reflected the water uptake of octopus and the corresponding concentration decrease of available ions in the interstitial fluid. Significant correlations were determined between octopus water uptake, EC (R = -0.940) and moisture/protein (M/P) ratio (R = 0.923) changes. Seasonal and spatial variation in proximate composition did not introduce any uncertainty in EC discrimination of freshwater tampering. Immersion in 5 g L⁻¹ sodium tripolyphosphate (STPP) increased EC to a value similar to control octopus. EC false negatives resulting from the use of additives (STPP and citric acid) were eliminated with the additional determination of pH. Octopus soaked in freshwater, STPP and citric acid can also be clearly discriminated from untreated samples (control) and from frozen (thawed) ones using the dielectric properties. No significant differences in the dielectric property scores were found between octopus sizes or geographical locations. Simultaneous EC/pH or dielectric property measurements can be used in a handheld device for non-destructive water-addition detection in octopus. The M/P ratio can be used as a reference destructive method. © 2017 Society of Chemical Industry.

  14. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. A process control and quality improvement design was used. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean verbal numerical rating scale (VNRS) pain score was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that…

  15. Testing statistical hypotheses

    CERN Document Server

    Lehmann, E L

    2005-01-01

    The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...

  16. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Science.gov (United States)

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grades 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods the prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.

  17. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    Science.gov (United States)

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component, while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T² statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy, and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, it was found that the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
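
    The monitoring step at the heart of this comparison, charting Hotelling's T² computed from latent-variable scores against an F-distribution control limit, can be sketched as below. The scores are simulated stand-ins for projections of NIR spectra onto a fitted model; nothing here reproduces the paper's OPLS or PLS models.

      # Hedged sketch: Hotelling's T^2 monitoring on latent-variable scores,
      # as used in batch SPC. Scores are simulated, not derived from NIR data.
      import numpy as np
      from scipy.stats import f

      rng = np.random.default_rng(0)
      A, N = 2, 50                       # latent variables, nominal batches
      nominal = rng.normal(size=(N, A))  # scores of in-control training batches

      mean = nominal.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(nominal, rowvar=False))

      def t2(score):
          d = score - mean
          return float(d @ cov_inv @ d)

      # F-based 95% limit for a new observation with estimated mean/covariance
      alpha = 0.05
      limit = A * (N**2 - 1) / (N * (N - A)) * f.ppf(1 - alpha, A, N - A)

      disturbed = rng.normal(loc=[3.0, 0.0])  # a batch with an imposed shift
      print(t2(disturbed), limit, t2(disturbed) > limit)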

  18. An integrated model of statistical process control and maintenance based on the delayed monitoring

    International Nuclear Information System (INIS)

    Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei

    2015-01-01

    This paper develops an integrated model of statistical process control and maintenance decisions. The proposed delayed monitoring policy postpones sampling until a scheduled time and gives rise to ten scenarios of the production process, in which equipment failure may occur in addition to a quality shift. Equipment failure and a control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, cycle time and cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to verify the analysis of the ten-scenario model. An ordinary integrated model without delayed monitoring is also established for comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy will help reduce the expected cost

  19. Improved statistical method for temperature and salinity quality control

    Science.gov (United States)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and ocean monitoring benefit from the continuous development of global in-situ hydrographic networks over recent decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are estimated directly from minimum and maximum values in a historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions, such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows the two main objectives of an automatic quality control strategy to be addressed simultaneously, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) three historical CTD datasets and 3) the sea mammal CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will soon be deployed in the equivalent near-real-time products.
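
    The min/max idea is simple enough to sketch directly: an observation is flagged when it falls outside the historical extremes recorded for its location and depth. The reference values and bin keys below are invented placeholders, not CORA data.

      # Sketch of min/max-based QC: flag a value outside historical extremes
      # for its grid cell. Reference extremes here are invented.
      reference = {
          # (lat_bin, lon_bin, depth_bin): (min_temp, max_temp) in deg C
          (45, -30, 0): (2.1, 24.8),
          (45, -30, 100): (1.8, 16.3),
      }

      def qc_temperature(lat_bin, lon_bin, depth_bin, value, margin=0.5):
          """Return True if the value is plausible; margin loosens the raw extremes."""
          lo, hi = reference[(lat_bin, lon_bin, depth_bin)]
          return (lo - margin) <= value <= (hi + margin)

      print(qc_temperature(45, -30, 0, 26.9))    # False -> flagged
      print(qc_temperature(45, -30, 100, 12.0))  # True  -> accepted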

  20. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Directory of Open Access Journals (Sweden)

    Fabiano Rodrigues Soriano

    Full Text Available Abstract Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring and analyzing causes of variation in quality characteristics and/or in the parameters used to control and improve processes. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as lack of top management commitment and little understanding of its potential benefits, and to technical factors, such as lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top management commitment (Support), SPC Training and Application, as well as the relationships between these factors and the benefits associated with implementing the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish causal relations. A cross-sectional survey was used to collect information from Brazilian auto-parts companies, which were selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey; 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs; these training programs in turn affect the way companies apply the techniques, and this is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected and are represented by the relation between top management support and training. The technical aspects observed through SPC…

  1. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    Science.gov (United States)

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  2. Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists

    Directory of Open Access Journals (Sweden)

    Laura Badenes-Ribera

    2018-06-01

    Full Text Available Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to misinterpretation of results. In addition, it found that, although the use of ES is becoming generalized, the same is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we developed an online survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women; mean age 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequently mentioned ES statistics were Cohen's d and R²/η², indexes that can be affected by outliers, non-normality, or violations of statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plots and funnel plots) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not…

  3. Use of statistic control of the process as part of a quality assurance plan; Empleo del control estadistico de proceso como parte de un plan de aseguramiento de la calidad

    Energy Technology Data Exchange (ETDEWEB)

    Acosta, S.; Lewis, C., E-mail: sacosta@am.gob.ar [Autoridad Regulatoria Nuclear (ARN), Buenos Aires (Argentina)

    2013-07-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of results through control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory for environmental measurements of strontium-90, which is in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work concerns the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to consider significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of a statistical process control. The data obtained yielded the lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that the chart can be used as part of a quality assurance plan.

  4. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust…

  5. H∞ Control for a Networked Control Model of Systems with Two Additive Time-Varying Delays

    Directory of Open Access Journals (Sweden)

    Hanyong Shao

    2014-01-01

    Full Text Available This paper is concerned with H∞ control for a networked control model of systems with two additive time-varying delays. A new Lyapunov functional is constructed to make full use of the information of the delays, and a novel technique is employed to compute a tighter upper bound on the derivative of the Lyapunov functional, which is dependent on the two time-varying delays instead of their upper bounds. The convex polyhedron method is then proposed to check the upper bound of the derivative of the Lyapunov functional. The resulting stability criteria have fewer matrix variables but less conservatism than some existing ones. The stability criteria are applied to designing a state feedback controller, which guarantees that the closed-loop system is asymptotically stable with a prescribed H∞ disturbance attenuation level. Finally, examples are given to show the advantages of the stability criteria and the effectiveness of the proposed control method.

  6. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    Science.gov (United States)

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Adaptive sampling rate control for networked systems based on statistical characteristics of packet disordering.

    Science.gov (United States)

    Li, Jin-Na; Er, Meng-Joo; Tan, Yen-Kheng; Yu, Hai-Bin; Zeng, Peng

    2015-09-01

    This paper investigates an adaptive sampling rate control scheme for networked control systems (NCSs) subject to packet disordering. The main objectives of the proposed scheme are (a) to avoid heavy packet disordering existing in communication networks and (b) to stabilize NCSs with packet disordering, transmission delay and packet loss. First, a novel sampling rate control algorithm based on statistical characteristics of disordering entropy is proposed; secondly, an augmented closed-loop NCS that consists of a plant, a sampler and a state-feedback controller is transformed into an uncertain and stochastic system, which facilitates the controller design. Then, a sufficient condition for stochastic stability in terms of Linear Matrix Inequalities (LMIs) is given. Moreover, an adaptive tracking controller is designed such that the sampling period tracks a desired sampling period, which represents a significant contribution. Finally, experimental results are given to illustrate the effectiveness and advantages of the proposed scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  8. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available This study analyzes data with the aim of improving the quality of statistical tools used in the assembly of automobile seats. Normal distribution of variables is one of the essential conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although approaches to handling non-normal data are becoming more common. The appropriate probability distribution of the measured data is first tested for goodness of fit between the empirical distribution and the theoretical normal distribution, on the basis of hypothesis testing using the program StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study focuses on the measured data from airbag assembly and aims to obtain normally distributed data for use in statistical process control. The results lead to rejection of the null hypothesis (the measured variables do not follow a normal distribution); it is therefore necessary to work on data transformation, supported by Minitab 15. Even this approach does not achieve normally distributed data, and so a procedure should be proposed that leads to quality output of the entire statistical control of the manufacturing processes.
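
    The workflow the abstract describes, testing normality and then attempting a transformation before charting, can be sketched as below. The data are simulated, and the study itself used StatGraphics and Minitab rather than Python.

      # Sketch: test normality; if rejected, try a Box-Cox power transform.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      x = rng.lognormal(mean=0.0, sigma=0.6, size=200)  # skewed measurements

      stat, p = stats.shapiro(x)
      print(f"Shapiro-Wilk p = {p:.4f}")     # small p -> reject normality

      if p < 0.05:
          y, lam = stats.boxcox(x)           # requires strictly positive data
          print(f"lambda = {lam:.3f}, "
                f"Shapiro p after transform = {stats.shapiro(y)[1]:.4f}")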

  9. Non-additive measure and integral

    CERN Document Server

    Denneberg, Dieter

    1994-01-01

    Non-Additive Measure and Integral is the first systematic approach to the subject. Much of the additive theory (convergence theorems, Lebesgue spaces, representation theorems) is generalized, at least for submodular measures, which are characterized by having a subadditive integral. The theory is of interest for applications to economic decision theory (decisions under risk and uncertainty), to statistics (including belief functions and fuzzy measures), to cooperative game theory, artificial intelligence, insurance, etc. Non-Additive Measure and Integral collects the results of scattered and often isolated approaches to non-additive measures and their integrals which originate in pure mathematics, potential theory, statistics, game theory, economic decision theory and other fields of application. It unifies, simplifies and generalizes known results and supplements the theory with new results, thus providing a sound basis for applications and further research in this growing field of increasing interest. It also co...

  10. Gonorrhea Statistics

    Science.gov (United States)

    CDC surveillance statistics for gonorrhea, from the Sexually Transmitted Diseases (STDs) statistics pages.

  11. Automated Material Accounting Statistics System at Rockwell Hanford Operations

    International Nuclear Information System (INIS)

    Eggers, R.F.; Giese, E.W.; Kodman, G.P.

    1986-01-01

    The Automated Material Accounting Statistics System (AMASS) was developed under the sponsorship of the U.S. Nuclear Regulatory Commission. The AMASS was developed when it was realized that classical methods of error propagation, based only on measured quantities, did not properly control the false alarm rate, and that errors other than measurement errors affect inventory differences. The classical assumptions that (1) the mean value of the inventory difference (ID) for a particular nuclear material processing facility is zero, and (2) the variance of the inventory difference is due only to errors in measured quantities, are overly simplistic. The AMASS provides a valuable statistical tool for estimating the true mean value and variance of the ID data produced by a particular material balance area. In addition, it provides statistical methods for testing both individual and cumulative sums of IDs, taking into account the estimated mean value and total observed variance of the ID.
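
    Two of the checks the abstract implies can be sketched as follows: a test of whether the mean inventory difference (ID) departs from zero, and a check of whether the cumulative ID drifts beyond its expected spread. The ID series is invented, and AMASS itself estimates the mean and variance with its own procedures.

      # Sketch: test individual and cumulative inventory differences (IDs).
      import numpy as np
      from scipy import stats

      ids = np.array([0.4, -0.2, 0.9, 0.3, 0.7, -0.1, 0.5, 0.8])  # kg per period

      t, p = stats.ttest_1samp(ids, popmean=0.0)   # is the true mean ID zero?
      print(f"mean ID = {ids.mean():.3f} kg, t = {t:.2f}, p = {p:.3f}")

      cum = np.cumsum(ids)                          # cumulative ID over periods
      se = ids.std(ddof=1) * np.sqrt(np.arange(1, len(ids) + 1))
      print("cumulative ID beyond 2 sigma:", np.abs(cum) > 2 * se)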

  12. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    Science.gov (United States)

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  13. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  14. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  15. Statistics Anxiety and Instructor Immediacy

    Science.gov (United States)

    Williams, Amanda S.

    2010-01-01

    The purpose of this study was to investigate the relationship between instructor immediacy and statistics anxiety. It was predicted that students receiving immediacy would report lower levels of statistics anxiety. Using a pretest-posttest-control group design, immediacy was measured using the Instructor Immediacy scale. Statistics anxiety was…

  16. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e., the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogs of earthquakes in Japan. The dependence is also physically reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
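
    The b-value itself is routinely estimated by maximum likelihood. A sketch of the standard estimator (Aki's formula), applied to simulated Gutenberg-Richter magnitudes above an assumed completeness magnitude Mc, is given below; the magnitude-binning correction used in practice is omitted.

      # Sketch: maximum-likelihood b-value estimate, b = log10(e) / (<M> - Mc).
      import numpy as np

      rng = np.random.default_rng(2)
      mc, b_true = 2.0, 1.0
      beta = b_true * np.log(10)
      mags = mc + rng.exponential(scale=1 / beta, size=5000)  # G-R magnitudes

      b_hat = np.log10(np.e) / (mags.mean() - mc)
      print(f"estimated b-value: {b_hat:.3f}")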

  17. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,…

  18. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    Science.gov (United States)

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate the performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse separate control hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved.
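
    A minimal EWMA chart of the kind compared here can be sketched as follows; the monthly SSI rates, baseline estimates and smoothing factor are illustrative assumptions, not values from the study.

      # Sketch: EWMA chart for a monthly SSI rate with time-varying limits.
      import math

      rates = [0.021, 0.018, 0.024, 0.020, 0.019, 0.035, 0.041, 0.038]
      lam = 0.2                   # EWMA smoothing factor (assumed)
      mu = sum(rates[:4]) / 4     # baseline mean from early, in-control months
      sd = 0.004                  # assumed baseline standard deviation

      z = mu
      for i, r in enumerate(rates, start=1):
          z = lam * r + (1 - lam) * z
          # exact variance of the EWMA statistic after i observations
          var = sd**2 * (lam / (2 - lam)) * (1 - (1 - lam)**(2 * i))
          ucl = mu + 3 * math.sqrt(var)
          sig = "SIGNAL" if z > ucl else ""
          print(f"month {i}: EWMA={z:.4f} UCL={ucl:.4f} {sig}")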

  19. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    Science.gov (United States)

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study, a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
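
    The underlying tabular CUSUM is easy to state in code. The sketch below applies a generic pair of one-sided CUSUM statistics to a sequence of residuals; the reference value k, decision threshold h and data are illustrative, and the paper's person-fit statistics replace these residuals with item-response quantities.

      # Sketch: tabular CUSUM with upper and lower one-sided statistics.
      def cusum(xs, target=0.0, k=0.5, h=4.0):
          """Return indices where either CUSUM crosses the threshold h."""
          s_hi = s_lo = 0.0
          signals = []
          for i, x in enumerate(xs):
              s_hi = max(0.0, s_hi + (x - target) - k)  # drift above target
              s_lo = max(0.0, s_lo - (x - target) - k)  # drift below target
              if s_hi > h or s_lo > h:
                  signals.append(i)
                  s_hi = s_lo = 0.0                     # restart after a signal
          return signals

      print(cusum([0.1, -0.2, 0.4, 1.3, 1.1, 1.6, 1.2, 0.9], h=3.0))  # -> [6]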

  20. Results of a multicentre randomised controlled trial of statistical process control charts and structured diagnostic tools to reduce ward-acquired meticillin-resistant Staphylococcus aureus: the CHART Project.

    Science.gov (United States)

    Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R

    2008-10-01

    Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers on ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms, including the control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.

  1. Optimage central organised image quality control including statistics and reporting

    International Nuclear Information System (INIS)

    Jahnen, A.; Schilz, C.; Shannoun, F.; Schreiner, A.; Hermen, J.; Moll, C.

    2008-01-01

    Quality control of medical imaging systems is performed using dedicated phantoms. As imaging systems become increasingly digital, adequate image processing methods can help save evaluation time and produce objective results. The software package OPTIMAGE was developed with this focus and takes a centralized approach: on one hand, OPTIMAGE provides a framework that includes functions such as database integration, DICOM data sources, a multilingual user interface and image processing functionality; on the other hand, the test methods are implemented as modules that can process the images automatically for the common imaging systems. The integration of statistics and reporting into this environment is paramount: this is the only way to provide these functions in an interactive, user-friendly way. These features enable users to discover degradation in performance quickly and to document performed measurements easily. (authors)

  2. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  3. Statistical analysis of sediment toxicity by additive monotone regression splines

    NARCIS (Netherlands)

    Boer, de W.J.; Besten, den P.J.; Braak, ter C.J.F.

    2002-01-01

    Modeling nonlinearity and thresholds in dose-effect relations is a major challenge, particularly in noisy data sets. Here we show the utility of nonlinear regression with additive monotone regression splines. These splines lead almost automatically to the estimation of thresholds. We applied this…

  4. Statistical quality management using miniTAB 14

    International Nuclear Information System (INIS)

    An, Seong Jin

    2007-01-01

    This book explains statistical quality management, covering the definition of quality, quality management, quality cost, basic methods of quality management, principles of control charts, control charts for variables, control charts for attributes, capability analysis, other issues of statistical process control, acceptance sampling, sampling for variables acceptance, design and analysis of experiments, Taguchi quality engineering, response surface methodology and reliability analysis.

  5. Statistical nuclear reactions

    International Nuclear Information System (INIS)

    Hilaire, S.

    2001-01-01

    A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)

  6. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. The phenomena include pollution transport prediction from continuous area sources or one-time releases of toxic materials, and wind energy prospecting in areas of topographic enhancement of the wind. We investigated quality control techniques that can be applied to these data to determine whether the instruments are operating within their prescribed tolerances. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability and power spectra. The comparative techniques include covariance, cross-spectral analysis and two-dimensional probability density. At present the calculating and plotting routines for these statistical techniques do not reside in a single code, so it is difficult to ascribe independent memory size and computation time accurately. However, given the flexibility of a data system which includes simple and fast running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF), a proper balance will be attained. These techniques are described in detail and preliminary results are presented

  7. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish control limits for the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process using the process capability index (Cpml). A total of 278 IMRT QA plans for nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-MV beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. Gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than in VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than in VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value was 1.60 for IMRT QA and 1.99 for VMAT QA, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because the Cpml values are higher than 1.0.
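
    The two computations the abstract turns on, control limits estimated from the first 50 plans and a capability index against a lower specification limit, can be sketched as below. The % gamma pass data are simulated, and the Cpml expression used is one common one-sided, target-based form; whether it matches the paper's exact definition is an assumption.

      # Sketch: control limits from the first 50 QA plans, plus a one-sided
      # target-based capability index. All numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(3)
      gamma_pass = np.clip(rng.normal(96.7, 2.2, size=159), 0, 100)  # % pass

      baseline = gamma_pass[:50]            # first 50 plans set the limits
      mu, sd = baseline.mean(), baseline.std(ddof=1)
      lcl = mu - 3 * sd                     # only the lower limit matters here
      print(f"mean = {mu:.1f}%, LCL = {lcl:.1f}%")

      lsl, target = 90.0, 100.0             # assumed spec limit and target
      cpml = (mu - lsl) / (3 * np.sqrt(sd**2 + (mu - target)**2))
      print(f"Cpml = {cpml:.2f} ({'capable' if cpml > 1 else 'not capable'})")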

  8. Teaching Quality Control with Chocolate Chip Cookies

    Science.gov (United States)

    Baker, Ardith

    2014-01-01

    Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…

  9. Statistical process control for radiotherapy quality assurance

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.

    2005-01-01

    Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a very few large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques were applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process, whereas with a standard deviation calculation no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process.
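
    For a single daily reading, the usual process behavior chart is the individuals (XmR) chart, whose limits come from the average moving range rather than the global standard deviation. The sketch below uses invented output readings; a production chart would also fix the centre line from a designated baseline period.

      # Sketch: individuals (XmR) chart with limits from the moving range.
      import numpy as np

      output = np.array([100.2, 99.8, 100.1, 99.9, 100.3, 100.0,
                         101.9, 102.1, 102.0])   # daily output, % of nominal

      mr_bar = np.abs(np.diff(output)).mean()    # average moving range
      center = output.mean()
      sigma_hat = mr_bar / 1.128                 # d2 constant for n = 2
      ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

      for day, x in enumerate(output, start=1):
          flag = "OUT" if not (lcl <= x <= ucl) else ""
          print(f"day {day}: {x:.1f}  [{lcl:.2f}, {ucl:.2f}] {flag}")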

  10. Versatility of cooperative transcriptional activation: a thermodynamical modeling analysis for greater-than-additive and less-than-additive effects.

    Directory of Open Access Journals (Sweden)

    Till D Frank

    Full Text Available We derive a statistical model of transcriptional activation using equilibrium thermodynamics of chemical reactions. We examine to what extent this statistical model predicts synergy effects of cooperative activation of gene expression. We determine parameter domains in which greater-than-additive and less-than-additive effects are predicted for cooperative regulation by two activators. We show that the statistical approach can be used to identify different causes of synergistic greater-than-additive effects: nonlinearities of the thermostatistical transcriptional machinery and three-body interactions between RNA polymerase and two activators. In particular, our model-based analysis suggests that at low transcription factor concentrations cooperative activation cannot yield synergistic greater-than-additive effects, i.e., DNA transcription can only exhibit less-than-additive effects. Accordingly, transcriptional activity turns from synergistic greater-than-additive responses at relatively high transcription factor concentrations into less-than-additive responses at relatively low concentrations. In addition, two types of re-entrant phenomena are predicted. First, our analysis predicts that under particular circumstances transcriptional activity will feature a sequence of less-than-additive, greater-than-additive, and eventually less-than-additive effects when for fixed activator concentrations the regulatory impact of activators on the binding of RNA polymerase to the promoter increases from weak, to moderate, to strong. Second, for appropriate promoter conditions when activator concentrations are increased then the aforementioned re-entrant sequence of less-than-additive, greater-than-additive, and less-than-additive effects is predicted as well. Finally, our model-based analysis suggests that even for weak activators that individually induce only negligible increases in promoter activity, promoter activity can exhibit greater-than-additive

  11. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  12. Enhancement of Engine Oil Wear and Friction Control Performance Through Titanium Additive Chemistry

    International Nuclear Information System (INIS)

    Guevremont, J.; Guinther, G.; Szemenyei, D.; Devlin, M.; Jao, T.; Jaye, C.; Woicik, J.; Fischer, D.

    2008-01-01

    Traditionally, wear protection and friction modification by engine oil is provided by zinc dithiophosphate (ZDDP) or other phosphorus compounds. These additives provide effective wear protection and friction control on engine parts through the formation of a glassy polyphosphate antiwear film. However, the deposition of phosphorus species on automotive catalytic converters from lubricants has been known for some time to have the detrimental effect of poisoning the catalysts. To mitigate the situation, the industry has been making every effort to find ZDDP-replacement additives that are friendly to catalysts. Toward this goal we have investigated a titanium additive chemistry as a ZDDP replacement. Fully formulated engine oils incorporating this additive component have been found to be effective in reducing wear and controlling friction in high-frequency reciprocating rig (HFRR), 4-ball bench wear, Sequence IIIG, and Sequence IVA engine tests. Surface analysis of the tested parts by Auger electron spectroscopy, secondary ion mass spectrometry (SIMS), and X-ray photoelectron spectroscopy (XPS) has shown that Ti species are incorporated into the wear tracks and can only be found on the wear tracks. We used synchrotron-based near-edge X-ray absorption fine structure (NEXAFS) spectroscopy to investigate the chemical bonding of the Ti additive with the metal surface that underlies the wear improvement mechanism. We postulate that Ti provides antiwear enhancement through inclusion in the metal/metal oxide structure of the ferrous surface by forming FeTiO3.

  13. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: Topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry. .

  14. Pre-Statistical Process Control: Making Numbers Count! JobLink Winning at Work Instructor's Manual, Module 3.

    Science.gov (United States)

    Coast Community Coll. District, Costa Mesa, CA.

    This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…

  15. Aminocarminic acid in E120-labelled food additives and beverages.

    Science.gov (United States)

    Sabatino, Leonardo; Scordino, Monica; Gargano, Maria; Lazzaro, Francesco; Borzì, Marco A; Traulo, Pasqualino; Gagliano, Giacomo

    2012-01-01

    An analytical method was developed for investigating the occurrence of aminocarminic acid in E120-labelled red-coloured beverages and in E120 additives, with the aim of controlling the purity of the carmine additive in countries where the use of aminocarminic acid is forbidden. Carminic acid and aminocarminic acid were separated by high-performance liquid chromatography with photodiode array detection and tandem mass spectrometry (HPLC-PDA-MS/MS). The method was statistically validated. The regression lines, covering the range from 10 to 100 mg/L, showed r² > 0.9996. Recoveries from 97% to 101% were obtained at a fortification level of 50 mg/L; the relative standard deviations did not exceed 3%. The LODs were below 2 mg/L, whereas the LOQs did not exceed 4 mg/L. The method was successfully applied to 27 samples of commercial E120-labelled red-coloured beverages and E120 additives, collected in Italy during quality control investigations conducted by the Ministry. The results demonstrated that more than 50% of the samples contained aminocarminic acid, evidencing the alarming illicit use of this semi-synthetic carminic acid derivative.
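
    The validation figures reported (linearity, LOD, LOQ) follow from an ordinary least-squares calibration line. A sketch under invented calibration data is given below, using the common ICH-style estimates LOD = 3.3·s/slope and LOQ = 10·s/slope, where s is the residual standard deviation.

      # Sketch: calibration linearity and LOD/LOQ from regression residuals.
      import numpy as np

      conc = np.array([10, 25, 50, 75, 100], dtype=float)  # standards, mg/L
      area = np.array([152, 371, 748, 1119, 1502], dtype=float)

      slope, intercept = np.polyfit(conc, area, 1)
      pred = slope * conc + intercept
      ss_res = np.sum((area - pred) ** 2)
      r2 = 1 - ss_res / np.sum((area - area.mean()) ** 2)
      s_res = np.sqrt(ss_res / (len(conc) - 2))            # residual std. dev.

      lod = 3.3 * s_res / slope
      loq = 10 * s_res / slope
      print(f"r^2 = {r2:.5f}, LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")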

  16. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    International Nuclear Information System (INIS)

    Carver, A; Rowbottom, C

    2016-01-01

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might naively be expected, such as between beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian

  17. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB. It covers the arrangement and summary of data, probability, probability distributions, sampling distributions, hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (including the concept of control charts, consecutive control charts, breakthrough strategy, and analysis using MATLAB), reliability analysis (including the measurement of reliability and analysis with MATLAB), and Markov chains.

  18. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  19. Fabrication of Hyperbranched Block-Statistical Copolymer-Based Prodrug with Dual Sensitivities for Controlled Release.

    Science.gov (United States)

    Zheng, Luping; Wang, Yunfei; Zhang, Xianshuo; Ma, Liwei; Wang, Baoyan; Ji, Xiangling; Wei, Hua

    2018-01-17

    Dendrimers, with their hyperbranched structure and multivalent surface, are regarded as among the most promising candidates close to the ideal drug delivery system, but the clinical translation and scale-up production of dendrimers have been hampered significantly by synthetic difficulties. Therefore, there is considerable scope for the development of novel hyperbranched polymers that can not only address the drawbacks of dendrimers but also maintain their advantages. The reversible addition-fragmentation chain transfer self-condensing vinyl polymerization (RAFT-SCVP) technique has enabled facile preparation of segmented hyperbranched polymers (SHP) by using a chain transfer monomer (CTM)-based double-head agent during the past decade. Meanwhile, the design and development of block-statistical copolymers has been proven in our recent studies to be a simple yet effective way to address the dilemma of extracellular stability versus high intracellular delivery efficacy. To integrate the advantages of both hyperbranched and block-statistical structures, we herein report the fabrication of a hyperbranched block-statistical copolymer-based prodrug with pH and reduction dual sensitivities using RAFT-SCVP and post-polymerization click coupling. The external homo oligo(ethylene glycol methyl ether methacrylate) (OEGMA) block provides sufficient extracellular colloidal stability for the nanocarriers through steric hindrance, and the interior OEGMA units incorporated by the statistical copolymerization promote intracellular drug release by facilitating the permeation of GSH and H+ for the cleavage of the reduction-responsive disulfide bond and the pH-labile carbonate link, as well as by weakening the hydrophobic encapsulation of drug molecules. The delivery efficacy of the target hyperbranched block-statistical copolymer-based prodrug was evaluated in terms of in vitro drug release and cytotoxicity studies, which confirm both acidic pH- and reduction-triggered drug release for inhibiting proliferation of He

  20. Multivariate statistical methods a first course

    CERN Document Server

    Marcoulides, George A

    2014-01-01

    Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin

  1. Application of Statistical Increase in Industrial Quality

    International Nuclear Information System (INIS)

    Akhmad-Fauzy

    2000-01-01

    The application of statistical methods in industry is relatively new compared with agriculture and biology. Statistical methods applied in industry focus mainly on industrial process control and are useful for maintaining economical control of product quality in large-scale production. The application of statistical methods in industry has increased rapidly. This is supported by the release of the ISO 9000 quality system in 1987 as an international quality standard, which has been adopted by more than 100 countries. (author)

  2. Application of statistical process control to qualitative molecular diagnostic assays.

    Science.gov (United States)

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
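
    As a rough illustration of the frequency-plus-confidence-interval idea this abstract describes, the sketch below computes a Wilson score interval for an observed mutation frequency and flags runs whose interval excludes the expected value. The counts and the expected frequency are hypothetical, and the authors' exact estimator is not specified here.

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for an observed proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical run: 14 mutation-positive results in 120 assays, against
# an expected (historical) mutation frequency of 20%.
expected = 0.20
lo, hi = wilson_ci(14, 120)
if not (lo <= expected <= hi):
    print(f"CI ({lo:.3f}, {hi:.3f}) excludes {expected}: investigate assay")
else:
    print(f"CI ({lo:.3f}, {hi:.3f}) consistent with expected frequency")
```

    As the abstract notes, low mutation frequencies make this interval wide, so more samples are needed before a real deviation can be distinguished from noise.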

  3. Application of statistical process control to qualitative molecular diagnostic assays

    LENUS (Irish Health Repository)

    O'Brien, Cathal P.

    2014-11-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.

  4. Basic elements of computational statistics

    CERN Document Server

    Härdle, Wolfgang Karl; Okhrin, Yarema

    2017-01-01

    This textbook on computational statistics presents tools and concepts of univariate and multivariate statistical data analysis with a strong focus on applications and implementations in the statistical software R. It covers mathematical, statistical as well as programming problems in computational statistics and contains a wide variety of practical examples. In addition to the numerous R snippets presented in the text, all computer programs (quantlets) and data sets for the book are available on GitHub and referred to in the book. This enables the reader to fully reproduce as well as modify and adjust all examples to their needs. The book is intended for advanced undergraduate and first-year graduate students as well as for data analysts new to the job who would like a tour of the various statistical tools in a data analysis workshop. The experienced reader with a good knowledge of statistics and programming might skip some sections on univariate models and enjoy the various mathematical roots of multivariate ...

  5. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    Science.gov (United States)

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities were implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  6. THE FLUORBOARD A STATISTICALLY BASED DASHBOARD METHOD FOR IMPROVING SAFETY

    International Nuclear Information System (INIS)

    PREVETTE, S.S.

    2005-01-01

    The FluorBoard is a statistically based dashboard method for improving safety. Fluor Hanford has achieved significant safety improvements--including more than an 80% reduction in OSHA cases per 200,000 hours--during its work at the US Department of Energy's Hanford Site in Washington state. The massive project on the former nuclear materials production site is considered one of the largest environmental cleanup projects in the world. Fluor Hanford's safety improvements were achieved by a committed partnering of workers, managers, and statistical methodology. Safety achievements at the site have been due to a systematic approach to safety. This includes excellent cooperation between the field workers, the safety professionals, and management through OSHA Voluntary Protection Program principles. Fluor corporate values are centered around safety, and safety excellence is important for every manager in every project. In addition, Fluor Hanford has utilized a rigorous approach to using its safety statistics, based upon Dr. Shewhart's control charts, and Dr. Deming's management and quality methods

  7. A statistical analysis of the impact of advertising signs on road safety.

    Science.gov (United States)

    Yannis, George; Papadimitriou, Eleonora; Papantoniou, Panagiotis; Voulgari, Chrisoula

    2013-01-01

    This research aims to investigate the impact of advertising signs on road safety. An exhaustive review of the international literature was carried out on the effect of advertising signs on driver behaviour and safety. Moreover, a before-and-after statistical analysis with control groups was applied to several road sites with different characteristics in the Athens metropolitan area, in Greece, in order to investigate the correlation between the placement or removal of advertising signs and the related occurrence of road accidents. Road accident data for the 'before' and 'after' periods on the test sites and the control sites were extracted from the database of the Hellenic Statistical Authority, and the selected 'before' and 'after' periods vary from 2.5 to 6 years. The statistical analysis shows no statistically significant correlation between road accidents and advertising signs at any of the nine sites examined, as the confidence intervals of the estimated safety effects are non-significant at the 95% confidence level. This can be explained by the fact that, at the examined road sites, drivers are overloaded with information (traffic signs, direction signs, shop labels, pedestrians and other vehicles, etc.), so the additional information load from advertising signs may not further distract them.

  8. Morphological control in polymer solar cells using low-boiling-point solvent additives

    Science.gov (United States)

    Mahadevapuram, Rakesh C.

    In the global search for clean, renewable energy sources, organic photovoltaics (OPVs) have recently been given much attention. Popular modern-day OPVs are made from solution-processible, carbon-based polymers (e.g. the model poly(3-hexylthiophene), P3HT) that are intimately blended with fullerene derivatives (e.g. [6,6]-phenyl-C71-butyric acid methyl ester) to form what is known as the dispersed bulk-heterojunction (BHJ). This BHJ architecture has produced some of the most efficient OPVs to date, with reports closing in on 10% power conversion efficiency. To push efficiencies further into double digits, many groups have identified the BHJ nanomorphology---that is, the phase separations and grain sizes within the polymer:fullerene composite---as a key aspect in need of control and improvement. As a result, many methods, including thermal annealing, slow-drying (solvent) annealing, vapor annealing, and solvent additives, have been developed and studied to promote BHJ self-organization. Processing organic photovoltaic (OPV) blend solutions with high-boiling-point solvent additives has recently been used for morphological control in BHJ OPV cells. Here we show that even low-boiling-point solvents can be effective additives. When P3HT:PCBM OPV cells were processed with the low-boiling-point solvent tetrahydrofuran as an additive in the parent solvent o-dichlorobenzene, charge extraction increased, leading to fill factors as high as 69.5%, without low work-function cathodes, electrode buffer layers or thermal treatment. This was attributed to PCBM demixing from P3HT domains and better vertical phase separation, as indicated by photoluminescence lifetimes, hole mobilities, and shunt leakage currents. Dependence on solvent parameters and applicability beyond the P3HT system were also investigated.

  9. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of normal distribution; and tests involving the normal or student's 't' distributions. The use of control charts for sample means; the ranges

  10. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    International Nuclear Information System (INIS)

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system lies in the continuous improvement activities that follow it. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement

  11. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  12. Using a statistical process control chart during the quality assessment of cancer registry data.

    Science.gov (United States)

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during the diagnosis years of 2001 and 2002, to exceed the lower control limit (LCL) in 2003, and to exceed the upper control limit (UCL) in 2004; the in situ/localized stages were in control throughout the diagnosis period, the regional stage exceeded the UCL in 2004, and the distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
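
    A p-chart of the kind used in this report is easy to sketch. The Python fragment below computes per-year proportions with 3-sigma binomial limits; the counts are hypothetical stand-ins, not the registry's data, and the authors' exact chart parameters may differ.

```python
import numpy as np

def p_chart_limits(counts, sizes):
    """p-chart: per-subgroup proportions with 3-sigma binomial limits."""
    counts, sizes = np.asarray(counts), np.asarray(sizes)
    p = counts / sizes
    pbar = counts.sum() / sizes.sum()          # center line
    se = np.sqrt(pbar * (1 - pbar) / sizes)    # per-subgroup standard error
    ucl = pbar + 3 * se
    lcl = np.clip(pbar - 3 * se, 0, None)      # proportions cannot go below 0
    return p, pbar, lcl, ucl

# Hypothetical yearly counts of distant-stage cases among all staged cases
cases = [310, 295, 240, 390]
totals = [6400, 6350, 6420, 6478]
p, pbar, lcl, ucl = p_chart_limits(cases, totals)
for year, pi, lo, hi in zip(range(2001, 2005), p, lcl, ucl):
    flag = "in control" if lo <= pi <= hi else "OUT of control"
    print(f"{year}: p={pi:.4f} limits=({lo:.4f}, {hi:.4f}) {flag}")
```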

  13. Statistical Models and Methods for Lifetime Data

    CERN Document Server

    Lawless, Jerald F

    2011-01-01

    Praise for the First Edition"An indispensable addition to any serious collection on lifetime data analysis and . . . a valuable contribution to the statistical literature. Highly recommended . . ."-Choice"This is an important book, which will appeal to statisticians working on survival analysis problems."-Biometrics"A thorough, unified treatment of statistical models and methods used in the analysis of lifetime data . . . this is a highly competent and agreeable statistical textbook."-Statistics in MedicineThe statistical analysis of lifetime or response time data is a key tool in engineering,

  14. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  15. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    Science.gov (United States)

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  16. A Multicenter, Prospective, Randomized Controlled Trial to Evaluate the Additional Benefit of a Multistrain Synbiotic (Prodefen®) in the Clinical Management of Acute Viral Diarrhea in Children

    Directory of Open Access Journals (Sweden)

    Emilia García-Menor MD

    2016-11-01

    Full Text Available This randomized, open-label study evaluated the additional benefits of the synbiotic Prodefen® in the clinical management of acute diarrhea of suspected viral origin in children between 6 months and 12 years of age. Study outcomes included the duration of diarrhea, the recovery from diarrhea, and the tolerability and acceptance of the treatment. The proportion of patients without diarrhea over the study period was greater in the synbiotic group than in the control group at all study time points, showing a statistically significant difference on the fifth day (95% vs 79%, p < 0.001). The duration of diarrhea (median and interquartile range) was reduced by 1 day in the synbiotic-treated patients (3 [2-5] vs 4 [3-5], p = 0.377). The tolerability of the treatment regimen, as evaluated by the parents, was significantly better in those receiving the synbiotic than in the control group. Overall, 96% of the parents of children receiving the synbiotic reported being satisfied to very satisfied with the treatment regimen. The results of this study indicate that the addition of the synbiotic Prodefen® is a well-tolerated and well-accepted approach that provides an additional benefit to the standard supportive therapy in the management of acute viral diarrhea in children.

  17. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Science.gov (United States)

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after stabilizing the process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
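
    The ARL(0) computed in the fourth step has a simple closed form for a Shewhart chart on approximately normal data: the expected number of in-control points before a false alarm is 1/alpha, where alpha is the per-point false-alarm probability. A small sketch, assuming two-sided z-sigma limits:

```python
from scipy.stats import norm

def arl0(z):
    """In-control average run length for a Shewhart chart with +/- z sigma
    limits: ARL0 = 1 / alpha, with alpha the two-sided tail probability."""
    alpha = 2 * norm.sf(z)
    return 1 / alpha

print(f"ARL0 at 3-sigma limits:    {arl0(3.0):.0f}")   # ~370 points per false alarm
print(f"ARL0 at 1.96-sigma limits: {arl0(1.96):.0f}")  # ~20: 95% limits alarm often
```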

  18. Statistical Analysis of Coherent Ultrashort Light Pulse CDMA With Multiple Optical Amplifiers Using Additive Noise Model

    Science.gov (United States)

    Jamshidi, Kambiz; Salehi, Jawad A.

    2005-05-01

    This paper describes a study of the performance of various configurations for placing multiple optical amplifiers in a typical coherent ultrashort light pulse code-division multiple access (CULP-CDMA) communication system using the additive noise model. For this study, a comprehensive performance analysis was developed that takes into account multiple-access noise, noise due to optical amplifiers, and thermal noise using the saddle-point approximation technique. Prior to obtaining the overall system performance, input/output statistical models for the different elements of the system, such as the encoders/decoders, star coupler, and optical amplifiers, were obtained. Performance comparisons between an ideal, lossless quantum-limited case and a typical CULP-CDMA system with various losses show that more than 30 dB of additional power is required to obtain the same bit-error rate (BER). Considering the saturation effect of optical amplifiers, this paper discusses an algorithm for setting the amplifiers' gain in the various stages of the network in order to overcome the nonlinear effects on signal modulation in the optical amplifiers. Finally, using this algorithm, various configurations of multiple optical amplifiers in CULP-CDMA are discussed, and rules for the required optimum number of amplifiers and their corresponding optimum locations along the CULP-CDMA system are given.

  19. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    Science.gov (United States)

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  20. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  1. Using Statistical Process Control Charts to Identify the Steroids Era in Major League Baseball: An Educational Exercise

    Science.gov (United States)

    Hill, Stephen E.; Schvaneveldt, Shane J.

    2011-01-01

    This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…

  2. Reanalysis of morphine consumption from two randomized controlled trials of gabapentin using longitudinal statistical methods

    Directory of Open Access Journals (Sweden)

    Zhang S

    2015-02-01

    Full Text Available Shiyuan Zhang,1 James Paul,2 Manyat Nantha-Aree,2 Norman Buckley,2 Uswa Shahzad,2 Ji Cheng,2 Justin DeBeer,5 Mitchell Winemaker,5 David Wismer,5 Dinshaw Punthakee,5 Victoria Avram,5 Lehana Thabane1–4 1Department of Clinical Epidemiology and Biostatistics, McMaster University, 2Department of Anesthesia, McMaster University, 3Biostatistics Unit/Centre for Evaluation of Medicines, St Joseph’s Healthcare-Hamilton, 4Population Health Research Institute, Hamilton Health Science/McMaster University, 5Department of Surgery, Division of Orthopaedics, McMaster University, Hamilton, ON, Canada Background: Postoperative pain management in total joint replacement surgery remains ineffective in up to 50% of patients and has an overwhelming impact in terms of patient well-being and health care burden. We present here an empirical analysis of two randomized controlled trials assessing whether addition of gabapentin to a multimodal perioperative analgesia regimen can reduce morphine consumption or improve analgesia for patients following total joint arthroplasty (the MOBILE trials). Methods: Morphine consumption, measured for four time periods in patients undergoing total hip or total knee arthroplasty, was analyzed using a linear mixed-effects model to provide a longitudinal estimate of the treatment effect. Repeated-measures analysis of variance and generalized estimating equations were used in a sensitivity analysis to compare the robustness of the methods. Results: There was no statistically significant difference in morphine consumption between the treatment group and a control group (mean effect size estimate 1.0, 95% confidence interval −4.7, 6.7, P=0.73). The results remained robust across different longitudinal methods. Conclusion: The results of the current reanalysis of morphine consumption align with those of the MOBILE trials. Gabapentin did not significantly reduce morphine consumption in patients undergoing major replacement surgeries. The

  3. ON STATISTICALLY CONVERGENT IN FINITE DIMENSIONAL SPACES

    OpenAIRE

    GÜNCAN, Ayşe Nur

    2009-01-01

    Abstract: In this paper, the notion of statistical convergence, which was introduced by Steinhaus (1951), is studied in R^m; some concepts and theorems whose statistical counterparts were given for real number sequences are carried over to R^m. In addition, the concepts of the statistical limit point and the statistical cluster point are given, and it is noted that these two concepts were shown not to be equivalent in Fridy's study of 1993. These concepts were given in R^m and the i...

  4. Titanic: A Statistical Exploration.

    Science.gov (United States)

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)

  5. Emollient bath additives for the treatment of childhood eczema (BATHE): multicentre pragmatic parallel group randomised controlled trial of clinical and cost effectiveness.

    Science.gov (United States)

    Santer, Miriam; Ridd, Matthew J; Francis, Nick A; Stuart, Beth; Rumsby, Kate; Chorozoglou, Maria; Becque, Taeko; Roberts, Amanda; Liddiard, Lyn; Nollett, Claire; Hooper, Julie; Prude, Martina; Wood, Wendy; Thomas, Kim S; Thomas-Jones, Emma; Williams, Hywel C; Little, Paul

    2018-05-03

    over the 16-week period was 7.5 (SD 6.0) in the bath additives group and 8.4 (SD 6.0) in the no bath additives group. No statistically significant difference was found in weekly POEM scores between groups over 16 weeks. After controlling for baseline severity and confounders (ethnicity, topical corticosteroid use, soap substitute use) and allowing for clustering of participants within centres and responses within participants over time, POEM scores in the no bath additives group were 0.41 points higher than in the bath additives group (95% confidence interval -0.27 to 1.10), below the published minimal clinically important difference for POEM of 3 points. The groups did not differ in secondary outcomes, economic outcomes, or adverse effects. This trial found no evidence of clinical benefit from including emollient bath additives in the standard management of eczema in children. Further research is needed into optimal regimens for leave-on emollient and soap substitutes. Current Controlled Trials ISRCTN84102309. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.

  7. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    Science.gov (United States)

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  8. Statistical methods in radiation physics

    CERN Document Server

    Turner, James E; Bogard, James S

    2012-01-01

    This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.

  9. Building information for systematic improvement of the prevention of hospital-acquired pressure ulcers with statistical process control charts and regression.

    Science.gov (United States)

    Padula, William V; Mishra, Manish K; Weaver, Christopher D; Yilmaz, Taygan; Splaine, Mark E

    2012-06-01

    To demonstrate complementary results of regression and statistical process control (SPC) chart analyses for hospital-acquired pressure ulcers (HAPUs), and to identify possible links between changes and opportunities for improvement between hospital microsystems and macrosystems. Ordinary least squares and panel data regression of retrospective hospital billing data, and SPC charts of prospective patient records for a US tertiary-care facility (2004-2007). A prospective cohort of hospital inpatients at risk for HAPUs was the study population. There were 337 HAPU incidences hospital-wide among 43,844 inpatients. A probit regression model related age, gender and length of stay to HAPU incidence (pseudo R² = 0.096). Panel data analysis determined that for each additional day in the hospital, there was a 0.28% increase in the likelihood of HAPU incidence. A p-chart of HAPU incidence showed a mean incidence rate of 1.17%, remaining in statistical control. A t-chart showed that the average time between events for the last 25 HAPUs was 13.25 days. There was one 57-day period between two incidences during the observation period. A p-chart addressing Braden scale assessments showed that 40.5% of all patients were risk-stratified for HAPUs upon admission. SPC charts complement standard regression analysis. SPC amplifies patient outcomes at the microsystem level and is useful for guiding quality improvement. Macrosystems should monitor effective quality improvement initiatives in microsystems and aid the spread of successful initiatives to other microsystems, followed by system-wide analysis with regression. Although the HAPU incidence in this study is below the national mean, there is still room to improve HAPU incidence in this hospital setting, since 0% incidence is theoretically achievable. Further assessment of pressure ulcer incidence could illustrate improvement in the quality of care and prevent HAPUs.
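
    One common construction for a t-chart like the one described here (an option, not necessarily the authors' exact method) is Nelson's transformation: raise the times between events to the power 1/3.6 so the roughly exponential gaps become near-normal, then apply individuals-chart limits on the transformed scale. A sketch with hypothetical gap data:

```python
import numpy as np

def t_chart_limits(days_between, z=3.0):
    """t-chart for time between rare events: transform gaps with
    y = t**(1/3.6) (Nelson's approximation to normality), compute
    individuals-chart limits, then back-transform to the day scale."""
    y = np.asarray(days_between, dtype=float) ** (1 / 3.6)
    sigma = np.abs(np.diff(y)).mean() / 1.128   # moving-range sigma estimate
    ucl = y.mean() + z * sigma
    lcl = max(y.mean() - z * sigma, 0.0)
    # A gap above the back-transformed UCL signals an unusually long
    # event-free period, i.e. evidence of improvement.
    return lcl ** 3.6, ucl ** 3.6

# Hypothetical gaps (days) between consecutive HAPU incidences
gaps = [9, 14, 3, 21, 11, 57, 8, 13]
lo, hi = t_chart_limits(gaps)
print(f"Flag gaps shorter than {lo:.1f} or longer than {hi:.1f} days")
```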

  10. Bringing statistics up to speed with data in analysis of lymphocyte motility.

    Science.gov (United States)

    Letendre, Kenneth; Donnadieu, Emmanuel; Moses, Melanie E; Cannon, Judy L

    2015-01-01

    Two-photon (2P) microscopy provides immunologists with 3D video of the movement of lymphocytes in vivo. Motility parameters extracted from these videos allow detailed analysis of lymphocyte motility in lymph nodes and peripheral tissues. However, standard parametric statistical analyses such as the Student's t-test are often used incorrectly and fail to take into account confounds introduced by the experimental methods, potentially leading to erroneous conclusions about T cell motility. Here, we compare the motility of WT T cells versus PKCθ-/-, CARMA1-/-, CCR7-/-, and PTX-treated T cells. We show that the fluorescent dyes used to label T cells have significant effects on T cell motility, and we demonstrate the use of factorial ANOVA as a statistical tool that can control for these effects. In addition, researchers often choose between the use of "cell-based" parameters obtained by averaging multiple steps of a single cell over time (e.g. cell mean speed), or "step-based" parameters, in which all steps of a cell population (e.g. instantaneous speed) are grouped without regard for the cell track. Using mixed model ANOVA, we show that we can maintain cell-based analyses without losing the statistical power of step-based data. We find that as we use additional levels of statistical control, we can more accurately estimate the speed of T cells as they move in lymph nodes, as well as measure the impact of individual signaling molecules on T cell motility. As there is increasing interest in using computational modeling to understand T cell behavior in vivo, these quantitative measures not only give us a better determination of actual T cell movement, they may prove crucial for models to generate accurate predictions about T cell behavior.
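
    The mixed-model idea, keeping step-based data while treating the cell as a grouping factor, can be sketched with statsmodels. The data file and column names below are hypothetical, and the authors' actual factorial design may be richer than this single random intercept.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical step-based motility table: one row per step, recording the
# step speed, the genotype, the labeling dye, and the originating cell.
df = pd.read_csv("motility_steps.csv")  # columns: speed, genotype, dye, cell_id

# Mixed model: fixed effects for genotype and dye (the confound the authors
# control for), plus a random intercept per cell, so step-level observations
# are not treated as independent across cells.
model = smf.mixedlm("speed ~ genotype + dye", data=df, groups=df["cell_id"])
result = model.fit()
print(result.summary())
```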

  11. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against the quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart; it neglects the interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated by both SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA), with the data either autoscaled or robustly scaled, showed four and seven batches, respectively, outside the Hotelling T² 95% ellipse. Also, an improvement in the capability of the process is observed without the most extreme batches. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
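
    The PCA-based Hotelling T² screen the authors apply can be sketched as follows. The F-distribution control limit is one standard choice for score-space monitoring, and the data matrix here is a random placeholder for the 164 x 6 assay table, not the study's data.

```python
import numpy as np
from scipy.stats import f

def hotelling_t2_pca(X, n_components=2, alpha=0.05):
    """Per-batch T^2 statistic in a PCA score space, with an F-based limit."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # autoscaling
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)   # PCA via SVD
    scores = Xs @ Vt[:n_components].T
    lam = S[:n_components] ** 2 / (len(X) - 1)          # score variances
    t2 = np.sum(scores ** 2 / lam, axis=1)
    n, a = len(X), n_components
    limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f.ppf(1 - alpha, a, n - a)
    return t2, limit

# Placeholder assay matrix: 164 batches x 6 active-ingredient contents
X = np.random.default_rng(0).normal(100, 2, size=(164, 6))
t2, limit = hotelling_t2_pca(X)
print("Batches outside the 95% T2 limit:", np.where(t2 > limit)[0])
```

    The univariate Shewhart check would inspect each of the six assays separately; the T² statistic pools them, which is why it can flag batches whose individual assays all look unremarkable.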

  12. Hierarchical tailoring of strut architecture to control permeability of additive manufactured titanium implants.

    Science.gov (United States)

    Zhang, Z; Jones, D; Yue, S; Lee, P D; Jones, J R; Sutcliffe, C J; Jones, E

    2013-10-01

    Porous titanium implants are a common choice for bone augmentation. Implants for spinal fusion and repair of non-union fractures must encourage blood flow after implantation so that there is sufficient cell migration, nutrient and growth factor transport to stimulate bone ingrowth. Additive manufacturing techniques allow a large number of pore network designs. This study investigates how the design factors offered by the selective laser melting technique can be used to alter the implant architecture on multiple length scales to control, and even tailor, the flow. Permeability is a convenient parameter that characterises flow, correlating with structural openness (interconnectivity and pore window size), tortuosity and hence flow shear rates. Using experimentally validated computational simulations, we demonstrate how additive manufacturing can be used to tailor implant properties by controlling surface roughness at the microstructural level (microns) and by altering strut ordering and density at the mesoscopic level (millimetres). Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  13. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    Science.gov (United States)

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach to identifying sources of variability within a manufacturing process: NIR measurements of samples of intermediate material taken after each consecutive unit operation (the interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying the parameters of extrusion/spheronization and the characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases where historical process data are not straightforwardly available. In the presented case, changes in lactose characteristics influence the performance of the extrusion/spheronization process step. The pellet cores produced using one (considered less suitable) lactose source were on average larger and more fragile, leading to consequent breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process leading to compromised film coating.

  14. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    Science.gov (United States)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone-depleting compounds (ODC), including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifying the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-91 3NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A 26-inch Proceco Typhoon dishwasher cleaned both the tensile adhesion witness panels and the TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor

  15. Do perfume additives termed human pheromones warrant being termed pheromones?

    Science.gov (United States)

    Winman, Anders

    2004-09-30

    Two studies of the effects of perfume additives, termed human pheromones by the authors, have conveyed the message that these substances can promote an increase in human sociosexual behaviour [Physiol. Behav. 75 (2003) R1; Arch. Sex. Behav. 27 (1998) R2]. The present paper presents an extended analysis of these data. It is shown that in neither study is there a statistically significant increase in any of the sociosexual behaviours for the experimental groups. In the control groups of both studies, there are, however, moderate but statistically significant decreases in the corresponding behaviours. Most notably, there is no support in the data for the claim that the substances increase the attractiveness of the wearers to the other sex. It is concluded that more research using matched homogeneous groups of participants is needed. Copyright 2004 Elsevier Inc.

  16. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    Science.gov (United States)

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
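
    The sigma-metric the authors build on has a standard closed form: sigma = (TEa − |bias|) / CV, with the allowable total error, bias, and imprecision all expressed in percent. A one-line sketch (the numbers are illustrative, not from the paper):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric for an assay: sigma = (TEa - |bias|) / CV,
    with allowable total error, bias, and CV all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative assay: TEa = 10%, bias = 1.5%, CV = 2%
print(f"sigma = {sigma_metric(10.0, 1.5, 2.0):.2f}")  # 4.25
```

    Higher sigma values indicate lower risk and permit simpler, less frequent SQC rules; lower values call for more stringent multirule procedures.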

  17. Non-identical anyons and new statistics for spinons and holons

    International Nuclear Information System (INIS)

    Mor, T.

    1993-01-01

    We discuss the various existing proposals for the statistics of holons and spinons (the proposed fundamental excitations of the t-J model for high-temperature superconductors), and we present new anyonic alternatives. We generalize Wilczek's realization of anyons to include non-identical anyons with mutual statistics, and we concentrate on particular cases which preserve time-reversal symmetry. Under the restriction of time-reversal symmetry, we find two additional possibilities for the statistics of holons and spinons: either both holons and spinons are bosons, or both are fermions. In addition, they obey antisymmetric mutual statistics. We discuss the pairing mechanism of holons in this model, and the microscopic origins of this statistical behavior. (orig.)

  18. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. Rich varieties of advanced and recent statistical models are mostly available in open source software (one of them is R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We have previously made interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using Computer Intensive Statistics (Bootstrap and Markov Chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.
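
    The model classes listed here (LM, GLM, GAM, GAMLSS) are wrapped in R behind the web interface; as a language-neutral sketch of the simplest of these fits, here is a Poisson GLM in Python with a small hypothetical table. The column names and values are illustrative only.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data: event counts with a dose level and a group label
df = pd.DataFrame({
    "count": [2, 5, 1, 8, 4, 7, 0, 3],
    "dose":  [0, 1, 0, 2, 1, 2, 0, 1],
    "group": ["A", "B", "A", "B", "A", "B", "A", "B"],
})

# Poisson GLM with a log link: log E[count] = b0 + b1*dose + b2*[group=B]
fit = smf.glm("count ~ dose + group", data=df,
              family=sm.families.Poisson()).fit()
print(fit.summary())
```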

  19. Addition of Ceftriaxone and Amikacin to a Ciprofloxacin plus Metronidazole Regimen for Preventing Infectious Complications of Transrectal Ultrasound-Guided Prostate Biopsy: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Mohammad-Hossein Izadpanahi

    2017-01-01

    Full Text Available Background. The objective of this study was to evaluate the efficacy of adding single doses of ceftriaxone and amikacin to a ciprofloxacin plus metronidazole regimen on the reduction of infectious complications following transrectal ultrasound-guided prostate biopsy (TRUS Bx). Materials and Methods. Four hundred and fifty patients who were candidates for TRUS Bx were divided into two groups of 225 each. The control group received ciprofloxacin 500 mg orally every 12 hours together with metronidazole 500 mg orally every 8 hours from the day prior to the procedure until the fifth postoperative day. In the second group, single doses of ceftriaxone 1 g by intravenous infusion and amikacin 5 mg/kg intramuscularly were administered 30–60 minutes before TRUS Bx in addition to the oral antimicrobials described for group 1. The incidence of infection was compared between the groups. Results. The incidence of infectious complications in the intervention group was significantly lower than that in the control group (4.6% in the control group versus 0.9% in the intervention group, p=0.017). Conclusion. The addition of single doses of intramuscular amikacin and intravenously infused ceftriaxone to our prophylactic regimen of ciprofloxacin plus metronidazole resulted in a statistically significant reduction of infectious complications following TRUS Bx.

  20. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method: An Example with Alendronate

    NARCIS (Netherlands)

    van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

    2016-01-01

    For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in

  1. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method : An Example with Alendronate

    NARCIS (Netherlands)

    van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

    For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in

  2. Comparison of the effect of benzoic acid addition on the fermentation process quality with untreated silages

    Directory of Open Access Journals (Sweden)

    Petr Doležal

    2004-01-01

    Full Text Available The influence of benzoic acid and of formic acid (positive control) on the quality of the fermentation process in ensiled maize and pressed sugar beet pulp was studied in a laboratory experiment. The effect of the additive on the quality of the fermentation process during maize ensiling was studied in a first model experiment. The preservatives formic acid and benzoic acid were added to the ensiled maize at concentrations of 1 L/t and 1 kg/t, respectively. When benzoic acid was used as a preservative, the pH and the N-NH3/N-total ratio decreased statistically significantly (P<0.05). Sugar beet pulp silages with benzoic acid or formic acid had a better sensory evaluation after 32 days of storage than the control silage. The most intensive decrease in pH was observed after the addition of formic acid, as compared with the control silage. The statistically significantly (P<0.05) highest lactic acid content (49.64 ± 0.28) as well as the highest LA/VFA ratio were found in the sugar beet pulp silage with benzoic acid. Lactic acid constituted the highest percentage (P<0.05) of all fermentation acids in the silage with the benzoic acid additive (65.12 ± 0.80). Undesirable butyric acid (BA) was not found in any variant of the silages. A positive correlation was found between the titration acidity and the sum of acids in the dry matter of silage conserved with formic acid. The addition of organic acids significantly reduced the titration acidity and the fermentation acid content. Between the pH value and the lactic acid content, no correlation was found.

  3. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis for determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated with this methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, procedure quality, practice level, and operation type were identified as significant factors for screen-switch and mode-conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
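
    The multiplicative estimation described above can be sketched compactly. The following is a minimal illustration, not the authors' code: it assumes a data frame with a binary error column and one column per PSF surrogate (all names hypothetical) and performs forward stepwise selection on a logistic regression; the fitted coefficients then act multiplicatively on the odds of error, consistent with the multiplicative HEP contributions reported.

```python
# Hedged sketch: forward stepwise logistic regression of error/no-error
# outcomes on candidate PSF surrogates. df, "error" and the PSF column
# names are illustrative assumptions, not from the paper.
import statsmodels.api as sm

def forward_stepwise_logit(df, outcome, candidates, alpha=0.05):
    """Repeatedly add the most significant remaining PSF; stop when no
    candidate reaches significance level `alpha`."""
    selected = []
    while True:
        best_p, best_var = alpha, None
        for var in (c for c in candidates if c not in selected):
            X = sm.add_constant(df[selected + [var]])
            fit = sm.Logit(df[outcome], X).fit(disp=0)  # outcome must be 0/1
            if fit.pvalues[var] < best_p:
                best_p, best_var = fit.pvalues[var], var
        if best_var is None:
            return selected
        selected.append(best_var)

# For a fitted model, odds(error) = exp(b0) * prod_k exp(b_k * PSF_k),
# i.e. each significant PSF contributes a multiplicative factor to the
# HEP odds, mirroring the multiplicative form described in the abstract.
```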

  4. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy].

    Science.gov (United States)

    Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C

    2010-01-01

    Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, the root cause was identified and eliminated to prevent further set-up errors. Set-up errors were measured for the medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions, and the upper control limits were then calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using subgroups of 3 patients, three times each shift. These values were plotted on a control chart in real time. Control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the set-up process stability and, if and when stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified and corrective action was taken before proceeding with the treatment. The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared to weekly sampling of set-up error in each and every patient, which may only ensure that just those sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces the control costs. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.
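
    As a rough illustration of the charting step (not the clinic's implementation), the control limits for subgroup means of size 3 can be computed from the average subgroup range with the standard Shewhart factor A2; all set-up error values below are invented.

```python
# Minimal X-bar chart sketch for subgroups of 3 patients' set-up errors.
import numpy as np

A2 = 1.023  # standard Shewhart X-bar factor for subgroup size n = 3

subgroups = np.array([[1.2, 0.8, 1.5],    # set-up errors (mm), 3 patients
                      [0.9, 1.1, 1.3],    # per sampling occasion
                      [1.4, 1.0, 0.7]])

xbar = subgroups.mean(axis=1)             # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()   # average subgroup range
center = xbar.mean()
ucl, lcl = center + A2 * rbar, center - A2 * rbar

out_of_control = (xbar > ucl) | (xbar < lcl)  # points signalling instability
print(center, ucl, lcl, out_of_control)
```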

  5. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor the precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ±25% precision, ±30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than the objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
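
    The moving-average alternative suggested in the conclusion can be sketched in a few lines; the 90-day window, column names and recovery values below are illustrative assumptions, not figures from the report.

```python
# Hedged sketch: statistical P&A criteria from a moving average of control
# sample recoveries over a several-month window.
import pandas as pd

ctrl = pd.DataFrame(
    {"pct_recovery": [98.0, 104.0, 91.0, 107.0, 95.0, 101.0]},
    index=pd.to_datetime(["2024-01-05", "2024-01-19", "2024-02-02",
                          "2024-02-16", "2024-03-01", "2024-03-15"]),
)

rolling = ctrl["pct_recovery"].rolling("90D")
accuracy = rolling.mean()   # moving-average accuracy (% recovery)
precision = rolling.std()   # moving-average precision (spread of recoveries)
print(pd.DataFrame({"accuracy": accuracy, "precision": precision}))
```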

  6. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  7. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between the two beams plays the key role in first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  8. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science-and-society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
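
    As a flavor of the Bayes'-theorem pillar mentioned above, here is a small worked example (all numbers invented) of the posterior probability of disease given a positive diagnostic test:

```python
# P(disease | positive test) via Bayes' theorem; inputs are illustrative.
def posterior(prevalence, sensitivity, specificity):
    p_pos = (prevalence * sensitivity
             + (1.0 - prevalence) * (1.0 - specificity))  # total P(positive)
    return prevalence * sensitivity / p_pos

# A rare condition with a good test still yields a modest posterior:
print(posterior(prevalence=0.01, sensitivity=0.95, specificity=0.95))
# ~0.16, i.e. most positive results are false positives at 1% prevalence.
```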

  9. Foliar Potassium Fertilizer Additives Affect Soybean Response and Weed Control with Glyphosate

    Directory of Open Access Journals (Sweden)

    Kelly A. Nelson

    2012-01-01

    Full Text Available Research in 2004 and 2005 determined the effects of foliar-applied K-fertilizer sources (0-0-62-0, 0-0-25-17, 3-18-18-0, and 5-0-20-13; %N-%P2O5-%K2O-%S) and additive rates (2.2, 8.8, and 17.6 kg K ha−1) on glyphosate-resistant soybean response and weed control. Field experiments were conducted at Novelty and Portageville, with high soil-test K and weed populations, and at Malden, with low soil-test K and weed populations. At Novelty, grain yield increased with fertilizer additives at 8.8 kg K ha−1 in a high-yield, weed-free environment in 2004, but fertilizer additives reduced yield by up to 470 kg ha−1 in a low-yield year (2005), depending on the K source and rate. At Portageville, K-fertilizer additives increased grain yield from 700 to 1160 kg ha−1 compared to diammonium sulfate, depending on the K source and rate. At Malden, there was no yield response to K sources. Differences in leaf tissue K (P=0.03), S (P=0.03), B (P=0.0001), and Cu (P=0.008) concentrations among treatments were detected 14 d after treatment at Novelty and Malden. Tank mixtures of K-fertilizer additives with glyphosate may provide an option for foliar K applications.

  10. Statistics for environmental science and management

    National Research Council Canada - National Science Library

    Manly, B.F.J

    2009-01-01

    … Additional topics covered include environmental monitoring, impact assessment, censored data, environmental sampling, the role of statistics in environmental science, assessing site reclamation …

  11. Evaluation of calcium superphosphate as an additive to reduce gas emissions from rabbit manure

    Directory of Open Access Journals (Sweden)

    Fernando Estellés Barber

    2014-12-01

    Full Text Available Techniques to reduce the emission of air pollutants from livestock production are demanded. In this study, the effect of an additive (calcium superphosphate on gas emissions from rabbit manure was investigated and compared with a control where no additive was used. Calcium superphosphate was applied at a rate of 100 g/m2 per week in a manure pit during 2 cycles of growing rabbits. Manure samples were collected weekly and then chemically and microbiologically analysed. Gas emissions (ammonia, carbon dioxide, methane and nitrous oxide were determined in 2 open flux chambers. No differences were observed in gas emissions between the treated and control samples except for ammonia emissions, which were reduced by 33% when the additive was applied (P<0.05. No statistical differences were obtained in the microbial content between control and treatment, as results showed a high variability. Dry matter content and pH were the most influential parameters on the emission of gases from manure. According to these results, the application of calcium superphosphate may be considered as an effective technique to reduce ammonia emission from rabbit manure. The additive may also be potentially effective in other species, but additional research is necessary to investigate its performance.

  12. Maximization of DRAM yield by control of surface charge and particle addition during high dose implantation

    Science.gov (United States)

    Horvath, J.; Moffatt, S.

    1991-04-01

    Ion implantation processing exposes semiconductor devices to an energetic ion beam in order to deposit dopant ions in shallow layers. In addition to this primary process, foreign materials are deposited as particles and surface films. The deposition of particles is a major cause of IC yield loss and becomes even more significant as device dimensions decrease. Control of particle addition in a high-volume production environment requires procedures to limit beamline and endstation sources, control of particle transport, cleaning procedures and a well-grounded preventive maintenance philosophy. Control of surface charge by optimization of the ion beam and electron shower conditions, and measurement with a real-time charge sensor, has been effective in improving the yield of NMOS and CMOS DRAMs. Control of surface voltages to a range between 0 and -20 V was correlated with good implant yield with PI9200 implanters for p+ and n+ source-drain implants.

  13. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning-styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  14. Additional operations in algebra of structural numbers for control algorithm development

    Directory of Open Access Journals (Sweden)

    Morhun A.V.

    2016-12-01

    Full Text Available Structural numbers and the algebra of structural numbers are, owing to their simple representation, flexibility and existing algebraic operations, a powerful tool for a wide range of applications. In autonomous power supply systems and systems with distributed generation (Micro Grid), the mathematical apparatus of structural numbers can be used effectively to calculate the parameters of the operating modes of electric energy consumption. The purpose of this article is to present an extended algebra of structural numbers. We propose to extend the standard algebra with additional operations and modifications in order to broaden its scope of use, namely to construct flexible, adaptive control-system algorithms. This is achieved through the possibility of considering each individual component of the system with its own parameters while providing easy management of the entire system and of each individual component. Structural numbers and their extended algebra are thus a promising line of research that warrants further study.

  15. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
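
    The capability analysis referenced above can be sketched generically (this is not the study's code): the usual Cp and Cpk indices compare a monitored parameter's spread and centering against specification limits; all values below are hypothetical.

```python
# Hedged capability-analysis sketch for one monitored RCP parameter.
import numpy as np

x = np.random.default_rng(0).normal(loc=100.0, scale=2.0, size=200)
usl, lsl = 110.0, 90.0                        # illustrative spec limits

mu, sigma = x.mean(), x.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)                # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # capability incl. centering
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")          # >= 1.33 is a common benchmark
```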

  16. Influence of red wine fermentation oenological additives on inoculated strain implantation.

    Science.gov (United States)

    Duarte, Filomena L; Alves, Ana Claudia; Alemão, Maria Filomena; Baleiras-Couto, M Margarida

    2013-06-01

    Pure selected cultures of Saccharomyces cerevisiae starters are regularly used in the wine industry. A survey of S. cerevisiae populations during red wine fermentations was performed in order to evaluate the influence of oenological additives on the implantation of the inoculated strain. Pilot scale fermentations (500 L) were conducted with active dry yeast (ADY) and other commercial oenological additives, namely two commercial fermentation activators and two commercial tannins. Six microsatellite markers were used to type S. cerevisiae strains. The methodology proved to be very discriminating as a great diversity of wild strains (48 genotypes) was detected. Statistical analysis confirmed a high detection of the inoculated commercial strain, and for half the samples an effective implantation of ADY (over 80 %) was achieved. At late fermentation time, ADY strain implantation in fermentations conducted with commercial additives was lower than in the control. These results question the efficacy of ADY addition in the presence of other additives, indicating that further studies are needed to improve knowledge on oenological additives' use.

  17. Statistical mechanics principles and selected applications

    CERN Document Server

    Hill, Terrell L

    1956-01-01

    ""Excellent … a welcome addition to the literature on the subject."" - ScienceBefore the publication of this standard, oft-cited book, there were few if any statistical-mechanics texts that incorporated reviews of both fundamental principles and recent developments in the field.In this volume, Professor Hill offers just such a dual presentation - a useful account of basic theory and of its applications, made accessible in a comprehensive format. The book opens with concise, unusually clear introductory chapters on classical statistical mechanics, quantum statistical mechanics and the relatio

  18. Aspects of Additional Psychiatric Disorders in Severe Depression/Melancholia: A Comparison between Suicides and Controls and General Pattern

    Directory of Open Access Journals (Sweden)

    Ulrika Heu

    2018-06-01

    Full Text Available Objective: Additional and comorbid diagnoses are common among suicide victims with major depressive disorder (MDD) and have been shown to increase the suicide risk. The aim of the present study was, first, to investigate whether patients with severe depression/melancholia who had died by suicide showed more additional psychiatric disorders than a matched control group; second, to estimate general rates of comorbid and additional diagnoses in the total group of patients and compare them with the literature on MDD. Method: A blind record evaluation was performed on 100 suicide victims with severe depression/melancholia (MDD with melancholic and/or psychotic features: MDD-M/P) and matched controls admitted to the Department of Psychiatry, Lund, Sweden between 1956 and 1969 and monitored to 2010. Diagnoses in addition to severe depression were noted. Results: Less than half of both the suicides and controls had just one psychiatric disorder (47% in the suicide and 46% in the control group). The average number of diagnoses was 1.80 and 1.82, respectively. Additional diagnoses were not related to an increased suicide risk. Anxiety was the most common additional diagnosis. Occurrences of suspected schizophrenia/schizotypal or additional obsessive-compulsive symptoms were more common than expected, but alcohol use disorders did not appear very frequent. Conclusions: The known increased risk of suicide in MDD with comorbid/additional diagnoses does not seem to apply to persons with MDD-M/P. Some diagnoses, such as schizophrenia/schizotypal disorders, were more frequent than expected, which is discussed, and a genetic overlap with MDD-M/P is proposed.

  19. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2013-01-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.

  20. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina

    2013-09-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
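
    A Monte-Carlo check in the spirit of the validation above can be sketched as follows. The composite Gamma-Gamma product below is a standard way to sample Generalized-K fading, but the cell geometry, parameter values and the fractional power-control rule are simplifying assumptions, not the paper's setup.

```python
# Hedged Monte-Carlo sketch: uplink ICI from one interfering user under
# Generalized-K (Gamma-Gamma) composite fading with fractional power control.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
m, k = 2.0, 3.0      # multipath and shadowing shape parameters (assumed)
alpha = 3.5          # path-loss exponent (assumed)
eta = 0.8            # fractional power-control factor (assumed)

r_own = rng.uniform(0.1, 1.0, n)          # distance to the user's own BS
p_tx = r_own ** (alpha * eta)             # partial channel-inversion control
g = rng.gamma(m, 1 / m, n) * rng.gamma(k, 1 / k, n)  # unit-mean fading
d_int = rng.uniform(1.0, 2.0, n)          # distance to the interfered BS
ici = p_tx * g * d_int ** (-alpha)        # received interference power

print(ici.mean(), ici.var())  # moments to compare against a derived model
```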

  1. A bibliometric analysis of 50 years of worldwide research on statistical process control

    Directory of Open Access Journals (Sweden)

    Fabiane Letícia Lizarelli

    Full Text Available Abstract An increasing number of papers on statistical process control (SPC) has emerged in the last fifty years, especially in the last fifteen years. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC plays a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. There are several other statistical techniques and tools for assisting continuous improvement and innovation of products and processes but, despite the limitations in their use in improvement projects, there is growing concern about the use of SPC. A gap between the SPC techniques taught in engineering courses and their practical application to industrial problems is observed in empirical research; thus, it is important to understand what has been done and to identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates that there was a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers and their networks. The main sources of publications are also identified; it is observed that SPC papers are concentrated in a subset of the international research journals, not necessarily those with the highest impact factors. Furthermore, the papers are focused on the industrial engineering, operations research and management science fields. The most common term found in the papers was cumulative sum control charts, but new topics have emerged and have been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods.
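
    Since cumulative sum control charts emerge as the most common topic, a minimal tabular CUSUM may help fix ideas; the reference value k and decision interval h below are conventional textbook choices, and the data are simulated.

```python
# Tabular CUSUM sketch: detects a sustained mean shift faster than a
# Shewhart chart. Values of k and h are illustrative (in sigma units).
import numpy as np

def cusum(x, target, k=0.5, h=5.0):
    """Return indices where the upper or lower CUSUM crosses h."""
    cp = cm = 0.0
    alarms = []
    for i, xi in enumerate(x):
        z = xi - target
        cp = max(0.0, cp + z - k)   # accumulates upward deviations
        cm = max(0.0, cm - z - k)   # accumulates downward deviations
        if cp > h or cm > h:
            alarms.append(i)
            cp = cm = 0.0           # restart after signalling
    return alarms

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1, 1, 50)])  # shift at 50
print(cusum(x, target=0.0))  # alarms cluster shortly after the shift
```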

  2. IMPORTANCE OF MATERIAL BALANCES AND THEIR STATISTICAL EVALUATION IN RUSSIAN MATERIAL PROTECTION, CONTROL AND ACCOUNTING

    International Nuclear Information System (INIS)

    Fishbone, L.G.

    1999-01-01

    While substantial work has been performed in the Russian MPC and A Program, much more needs to be done at Russian nuclear facilities to complete four necessary steps. These are (1) periodically measuring the physical inventory of nuclear material, (2) continuously measuring the flows of nuclear material, (3) using the results to close the material balance, particularly at bulk processing facilities, and (4) statistically evaluating any apparent loss of nuclear material. The periodic closing of material balances provides an objective test of the facility's system of nuclear material protection, control and accounting. The statistical evaluation using the uncertainties associated with individual measurement systems involved in the calculation of the material balance provides a fair standard for concluding whether the apparent loss of nuclear material means a diversion or whether the facility's accounting system needs improvement. In particular, if unattractive flow material at a facility is not measured well, the accounting system cannot readily detect the loss of attractive material if the latter substantially derives from the former
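
    Steps (3) and (4) can be made concrete with a small sketch (all quantities invented): close the balance, propagate the individual measurement uncertainties, and test whether the apparent loss exceeds what measurement error alone would explain.

```python
# Hedged sketch of material-balance closing and statistical evaluation.
import math

beginning_inventory, additions = 102.4, 250.0      # kg (illustrative)
removals, ending_inventory = 249.1, 102.9          # kg (illustrative)
muf = beginning_inventory + additions - removals - ending_inventory

# Propagate the (assumed independent) measurement sigmas of the four terms:
sigmas = [0.15, 0.30, 0.30, 0.15]
sigma_muf = math.sqrt(sum(s * s for s in sigmas))

# Two-sided test at roughly 95%: an apparent loss within 2*sigma_MUF is
# consistent with measurement uncertainty rather than diversion.
print(muf, sigma_muf, abs(muf) > 2 * sigma_muf)
```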

  3. Hierarchical tailoring of strut architecture to control permeability of additive manufactured titanium implants

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Z. [Department of Materials, Imperial College London, South Kensington Campus, London, SW7 2AZ (United Kingdom); Jones, D. [School of Engineering, University of Liverpool, Brownlow Hill, Liverpool, L69 3GH (United Kingdom); Yue, S. [Manchester X-ray Imaging Facility, School of Materials, The University of Manchester, Oxford Road, M13 9PL (United Kingdom); Lee, P.D., E-mail: peter.lee@manchester.ac.uk [Manchester X-ray Imaging Facility, School of Materials, The University of Manchester, Oxford Road, M13 9PL (United Kingdom); Jones, J.R. [Department of Materials, Imperial College London, South Kensington Campus, London, SW7 2AZ (United Kingdom); Sutcliffe, C.J. [School of Engineering, University of Liverpool, Brownlow Hill, Liverpool, L69 3GH (United Kingdom); Jones, E. [Department of Advanced Technology, Stryker Orthopaedics, Raheen Business Park, Limerick (Ireland)

    2013-10-15

    Porous titanium implants are a common choice for bone augmentation. Implants for spinal fusion and repair of non-union fractures must encourage blood flow after implantation so that there is sufficient cell migration, nutrient and growth factor transport to stimulate bone ingrowth. Additive manufacturing techniques allow a large number of pore network designs. This study investigates how the design factors offered by the selective laser melting technique can be used to alter the implant architecture on multiple length scales to control and even tailor the flow. Permeability is a convenient parameter that characterises flow, correlating to structure openness (interconnectivity and pore window size), tortuosity and hence flow shear rates. Using experimentally validated computational simulations, we demonstrate how additive manufacturing can be used to tailor implant properties by controlling surface roughness at a microstructural level (microns), and by altering the strut ordering and density at a mesoscopic level (millimetres). Highlights: • Experimentally validated permeability prediction tools for hierarchical implants. • Randomised structures form preferential flow channels with stronger shear flows. • Hierarchical strut structures allow independent tailoring of flow and pore size.

  4. Hierarchical tailoring of strut architecture to control permeability of additive manufactured titanium implants

    International Nuclear Information System (INIS)

    Zhang, Z.; Jones, D.; Yue, S.; Lee, P.D.; Jones, J.R.; Sutcliffe, C.J.; Jones, E.

    2013-01-01

    Porous titanium implants are a common choice for bone augmentation. Implants for spinal fusion and repair of non-union fractures must encourage blood flow after implantation so that there is sufficient cell migration, nutrient and growth factor transport to stimulate bone ingrowth. Additive manufacturing techniques allow a large number of pore network designs. This study investigates how the design factors offered by the selective laser melting technique can be used to alter the implant architecture on multiple length scales to control and even tailor the flow. Permeability is a convenient parameter that characterises flow, correlating to structure openness (interconnectivity and pore window size), tortuosity and hence flow shear rates. Using experimentally validated computational simulations, we demonstrate how additive manufacturing can be used to tailor implant properties by controlling surface roughness at a microstructural level (microns), and by altering the strut ordering and density at a mesoscopic level (millimetres). Highlights: • Experimentally validated permeability prediction tools for hierarchical implants. • Randomised structures form preferential flow channels with stronger shear flows. • Hierarchical strut structures allow independent tailoring of flow and pore size.

  5. A handbook of statistical graphics using SAS ODS

    CERN Document Server

    Der, Geoff

    2014-01-01

    An Introduction to Graphics: Good Graphics, Bad Graphics, Catastrophic Graphics and Statistical Graphics; The Challenger Disaster; Graphical Displays; A Little History and Some Early Graphical Displays; Graphical Deception; An Introduction to ODS Graphics; Generating ODS Graphs; ODS Destinations; Statistical Graphics Procedures; ODS Graphs from Statistical Procedures; Controlling ODS Graphics; Controlling Labelling in Graphs; ODS Graphics Editor; Graphs for Displaying the Characteristics of Univariate Data: Horse Racing, Mortality Rates, Forearm Lengths, Survival Times and Geyser Eruptions; Introduction; Pie Chart, Bar Cha…

  6. Statistical methods to monitor the West Valley off-gas system

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1990-01-01

    This paper reports on the off-gas system for the ceramic melter operated at the West Valley Demonstration Project at West Valley, NY, which is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound. Therefore, multivariate statistical methods appropriate for the monitoring of many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can easily be adjusted to obtain the desired probability of a false out-of-control signal. The principal component (PC) scores have desirable statistical properties when the original variables are distributed as multivariate normals. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed.
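
    One statistic of the kind alluded to above is Hotelling's T² restricted to the retained principal components; the sketch below uses simulated data (not West Valley measurements) to show the computation.

```python
# Hedged sketch: Hotelling's T^2 on the first k principal-component scores
# of correlated process parameters, for use on a multivariate control chart.
import numpy as np

rng = np.random.default_rng(2)
baseline = rng.normal(size=(200, 6))   # in-control reference observations
new_obs = rng.normal(size=6) + 0.5     # current observation to judge

mean = baseline.mean(axis=0)
cov = np.cov(baseline, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]      # sort components by variance
k = 3                                  # retained components (illustrative)
P, lam = eigvecs[:, order[:k]], eigvals[order[:k]]

scores = P.T @ (new_obs - mean)        # PC scores of the new observation
t2 = float(np.sum(scores**2 / lam))    # Hotelling's T^2 on k components
print(t2)  # compared against an F- or chi-square-based control limit
```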

  7. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey. (eb)

  9. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)

  10. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method :  An Example with Alendronate

    NARCIS (Netherlands)

    van de Glind, Esther M M; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

    2016-01-01

    Background: For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. Objective: The aim of this study was to investigate the usefulness of statistical process control (SPC) for

  11. Controllable Microdroplet Splitting via Additional Lateral Flow and its Application in Rapid Synthesis of Multi-scale Microspheres

    KAUST Repository

    Zhou, Bingpu; Wang, Cong; Xiao, Xiao; Hui, Yu Sanna; Cao, Yulin; Wen, Weijia

    2015-01-01

    In this paper, we demonstrate that controllable microdroplet splitting could be obtained via additional lateral flow with simplicity and high controllability. The volume ratio of the two splitting products can be flexibly regulated by adjusting

  12. Statistical inference for financial engineering

    CERN Document Server

    Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki

    2014-01-01

    This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.

  13. Automation in Siemens fuel manufacturing - the basis for quality improvement by statistical process control (SPC)

    International Nuclear Information System (INIS)

    Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.

    1999-01-01

    Statistical Process Control (SPC) is one of the systematic tools that can make a valuable contribution to the control and planning activities for manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a program to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on the realization of SPC in three pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC, and subsequent modifications were made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser welding of spacers' are located at the different sites of ANF. Due to the completion of the first steps and the experience obtained in the pilot projects, the introduction program for SPC has already been extended to other manufacturing processes. (authors)

  14. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    International Nuclear Information System (INIS)

    Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.

    2014-01-01

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves

  15. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    Science.gov (United States)

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the

  16. Statistical control chart and neural network classification for improving human fall detection

    KAUST Repository

    Harrou, Fouzi; Zerrouki, Nabil; Sun, Ying; Houacine, Amrane

    2017-01-01

    This paper proposes a statistical approach to detect and classify human falls based on both visual data from a camera and accelerometric data captured by an accelerometer. Specifically, we first use a Shewhart control chart to detect the presence of potential falls from the accelerometric data. Unfortunately, this chart cannot distinguish real falls from fall-like actions, such as lying down. To bypass this difficulty, a neural network classifier is then applied, through visual data, only to the detected cases. To assess the performance of the proposed method, experiments are conducted on a publicly available fall detection database: the University of Rzeszow's fall detection (URFD) dataset. Results demonstrate that the detection phase plays a key role in reducing the number of sequences used as input to the neural network classifier, significantly reducing the computational burden and achieving better accuracy.
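
    The two-stage scheme reads naturally as chart-then-classify. The sketch below is a schematic of that pipeline, not the authors' system: a Shewhart rule flags candidate windows from the accelerometer magnitude, and only those flagged indices would be passed on to a separately trained classifier (left as a placeholder).

```python
# Hedged sketch of the two-stage fall-detection pipeline.
import numpy as np

def shewhart_flags(signal, baseline, L=3.0):
    """Flag samples beyond mean +/- L*sigma of an in-control baseline."""
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    return np.abs(signal - mu) > L * sigma

def classify(window):
    # Placeholder for the neural-network stage (fall vs. fall-like action).
    raise NotImplementedError

rng = np.random.default_rng(3)
baseline = rng.normal(9.8, 0.3, 500)   # quiet standing, |a| in m/s^2
signal = np.concatenate([rng.normal(9.8, 0.3, 100), [25.0, 3.0, 9.8]])

candidates = np.flatnonzero(shewhart_flags(signal, baseline))
print(candidates)  # only these samples would reach the classifier
```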

  17. Statistical control chart and neural network classification for improving human fall detection

    KAUST Repository

    Harrou, Fouzi

    2017-01-05

    This paper proposes a statistical approach to detect and classify human falls based on both visual data from a camera and accelerometric data captured by an accelerometer. Specifically, we first use a Shewhart control chart to detect the presence of potential falls from the accelerometric data. Unfortunately, this chart cannot distinguish real falls from fall-like actions, such as lying down. To bypass this difficulty, a neural network classifier is then applied, through visual data, only to the detected cases. To assess the performance of the proposed method, experiments are conducted on a publicly available fall detection database: the University of Rzeszow's fall detection (URFD) dataset. Results demonstrate that the detection phase plays a key role in reducing the number of sequences used as input to the neural network classifier, significantly reducing the computational burden and achieving better accuracy.

  18. Sustained impact of a short small group course with systematic feedback in addition to regular clinical clerkship activities on musculoskeletal examination skills--a controlled study.

    Science.gov (United States)

    Perrig, Martin; Berendonk, Christoph; Rogausch, Anja; Beyeler, Christine

    2016-01-28

    The discrepancy between the extensive impact of musculoskeletal complaints and the common deficiencies in musculoskeletal examination skills has led to increased emphasis on structured teaching and assessment. However, studies of single interventions are scarce and little is known about the time-dependent effect of assisted learning in addition to a standard curriculum. We therefore evaluated the immediate and long-term impact of a small-group course on musculoskeletal examination skills. All 48 Year 4 medical students of a 6-year curriculum, attending their 8-week clerkship of internal medicine at one university department in Berne, participated in this controlled study. Twenty-seven students were assigned to the intervention of a 6×1 h practical course (4-7 students; interactive hands-on examination of real patients; systematic, detailed feedback to each student by teacher, peers and patients). Twenty-one students took part in the regular clerkship activities only and served as controls. In all students, clinical skills (CS, 9 items) were assessed in an Objective Structured Clinical Examination (OSCE) station, including specific musculoskeletal examination skills (MSES, 7 items) and interpersonal skills (IPS, 2 items). Two raters assessed the skills on a 4-point Likert scale at the beginning (T0), at the end (T1) and 4-12 months after (T2) the clerkship. Statistical analyses included the Friedman test, Wilcoxon rank sum test and Mann-Whitney U test. At T0 there were no significant differences between the intervention and control groups. At T1 and T2 the control group showed no significant changes in CS, MSES and IPS compared to T0. In contrast, the intervention group significantly improved CS, MSES and IPS at T1. Students did not improve these skills during regular clinical clerkship activities alone. However, an additional small-group, interactive clinical skills course with feedback from various sources improved these essential examination skills immediately after the teaching and several months later.

  19. 76 FR 33419 - Nationally Recognized Statistical Rating Organizations

    Science.gov (United States)

    2011-06-08

    … 232, 240, 249, et al. Nationally Recognized Statistical Rating Organizations; Proposed Rule [… -11] RIN 3235-AL15 Nationally Recognized Statistical Rating Organizations. AGENCY: Securities and … rating organizations ("NRSROs"). In addition, in accordance with the Dodd-Frank Act, the Commission is …

  20. Monthly bulletin of statistics. May 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  1. Monthly bulletin of statistics. January 1996

    International Nuclear Information System (INIS)

    1996-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  2. Monthly bulletin of statistics. July 1997

    International Nuclear Information System (INIS)

    1997-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  3. Monthly bulletin of statistics. June 2001

    International Nuclear Information System (INIS)

    2001-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  4. Monthly bulletin of statistics. September 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  5. Monthly bulletin of statistics. December 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  6. Monthly bulletin of statistics. July 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  7. Monthly bulletin of statistics. September 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  8. Monthly bulletin of statistics. March 1995

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  9. Monthly bulletin of statistics. October 2007

    International Nuclear Information System (INIS)

    2007-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  10. Monthly bulletin of statistics. October 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  11. Monthly Bulletin of Statistics. July 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  12. Monthly bulletin of statistics. March 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  13. Monthly bulletin of statistics. February 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  14. Monthly bulletin of statistics. June 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  15. Monthly bulletin of statistics. November 2008

    International Nuclear Information System (INIS)

    2008-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  16. Microcanonical ensemble extensive thermodynamics of Tsallis statistics

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2005-01-01

    The microscopic foundation of the generalized equilibrium statistical mechanics based on the Tsallis entropy is given by using the Gibbs idea of statistical ensembles of the classical and quantum mechanics. The equilibrium distribution functions are derived by the thermodynamic method based upon the use of the fundamental equation of thermodynamics and the statistical definition of the functions of the state of the system. It is shown that if the entropic index ξ = 1/(q - 1) in the microcanonical ensemble is an extensive variable of the state of the system, then in the thermodynamic limit z̄ = 1/[(q - 1)N] = const the principle of additivity and the zero law of thermodynamics are satisfied. In particular, the Tsallis entropy of the system is extensive and the temperature is intensive. Thus, the Tsallis statistics completely satisfies all the postulates of the equilibrium thermodynamics. Moreover, evaluation of the thermodynamic identities in the microcanonical ensemble is provided by the Euler theorem. The principle of additivity and the Euler theorem are explicitly proved by using the illustration of the classical microcanonical ideal gas in the thermodynamic limit

  17. Microcanonical ensemble extensive thermodynamics of Tsallis statistics

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2006-01-01

    The microscopic foundation of the generalized equilibrium statistical mechanics based on the Tsallis entropy is given by using the Gibbs idea of statistical ensembles of the classical and quantum mechanics. The equilibrium distribution functions are derived by the thermodynamic method based upon the use of the fundamental equation of thermodynamics and the statistical definition of the functions of the state of the system. It is shown that if the entropic index ξ = 1/(q - 1) in the microcanonical ensemble is an extensive variable of the state of the system, then in the thermodynamic limit z̄ = 1/[(q - 1)N] = const the principle of additivity and the zero law of thermodynamics are satisfied. In particular, the Tsallis entropy of the system is extensive and the temperature is intensive. Thus, the Tsallis statistics completely satisfies all the postulates of the equilibrium thermodynamics. Moreover, evaluation of the thermodynamic identities in the microcanonical ensemble is provided by the Euler theorem. The principle of additivity and the Euler theorem are explicitly proved by using the illustration of the classical microcanonical ideal gas in the thermodynamic limit

  18. Update to the study protocol, including statistical analysis plan for a randomized clinical trial comparing comprehensive cardiac rehabilitation after heart valve surgery with control

    DEFF Research Database (Denmark)

    Sibilitz, Kirstine Laerum; Berg, Selina Kikkenborg; Hansen, Tina Birgitte

    2015-01-01

    , either valve replacement or repair, remains the treatment of choice. However, post-surgery, the transition to daily living may become a physical, mental and social challenge. We hypothesize that a comprehensive cardiac rehabilitation program can improve physical capacity and self-assessed mental health...... and reduce hospitalization and healthcare costs after heart valve surgery. METHODS: This randomized clinical trial, CopenHeartVR, aims to investigate whether cardiac rehabilitation in addition to usual care is superior to treatment as usual after heart valve surgery. The trial will randomly allocate 210...... patients 1:1 to an intervention or a control group, using central randomization, and blinded outcome assessment and statistical analyses. The intervention consists of 12 weeks of physical exercise and a psycho-educational intervention comprising five consultations. The primary outcome is peak oxygen uptake...

  19. Additional cash incentive within a conditional cash transfer scheme: a 'controlled before and during' design evaluation study from India.

    Science.gov (United States)

    Lahariya, Chandrakant; Mishra, Ashok; Nandan, Deoki; Gautam, Praveen; Gupta, Sanjay

    2011-01-01

    Conditional cash transfer (CCT) schemes have shown largely favorable changes in health-seeking behavior. This evaluation study assesses the process and performance of an additional cash incentive (ACI) scheme within an ongoing CCT scheme in India, and documents lessons. A controlled before-and-during design study was conducted in Madhya Pradesh state of India, from August 2007 to March 2008, with an increase in institutional deliveries as the primary outcome. In-depth interviews, focus group discussions and household surveys were used for data collection. Lack of awareness about the ACI scheme among the general population and beneficiaries, a cumbersome cash disbursement procedure, intricate eligibility criteria, extensive paperwork, and insufficient focus on community involvement were the major implementation challenges. There were anecdotal reports of political interference and possible scope for corruption. At the end of the implementation period, the overall rate of institutional deliveries had increased in both target and control populations; however, the differences were not statistically significant. No cause-and-effect association could be proven by this study. Poor planning and coordination, and lack of public awareness about the scheme, resulted in low utilization. Thus, proper IEC and training, a detailed implementation plan, orientation training for implementers, sufficient budgetary allocation, and community participation should be integral to the successful implementation of any such scheme. The lessons learned from this evaluation study may be useful in other developing-country settings and may be utilized for planning and implementing any ACI scheme in the future.

  20. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    1997-01-01

    Like the preceding volumes, which met with a lively response, the present volume collects contributions stressing methodology or successful industrial applications. The papers are classified under four main headings: sampling inspection, process quality control, data analysis and process capability studies, and experimental design.

  1. Statistical Optimization of Synthesis of Manganese Carbonates Nanoparticles by Precipitation Methods

    International Nuclear Information System (INIS)

    Javidan, A.; Rahimi-Nasrabadi, M.; Davoudi, A.A.

    2011-01-01

    In this study, an orthogonal array design (OAD), OA9, was employed as a statistical experimental method for the controllable, simple and fast synthesis of manganese carbonate nanoparticles. Ultrafine manganese carbonate nanoparticles were synthesized by a precipitation method involving the addition of a manganese ion solution to the carbonate reagent. The effects of the reaction conditions, for example, manganese and carbonate concentrations, flow rate of reagent addition and temperature, on the diameter of the synthesized nanoparticles were investigated. The effects of these factors on the width of the nanoparticles were quantitatively evaluated by analysis of variance (ANOVA). The results showed that manganese carbonate nanoparticles can be synthesized by controlling the manganese concentration, flow rate and temperature. Finally, the optimum conditions for the synthesis of manganese carbonate nanoparticles by this simple and fast method were proposed. The ANOVA results showed that manganese ion and carbonate reagent concentrations of 0.001 mol/L, a flow rate of 2.5 mL/min for the addition of the manganese reagent to the carbonate solution and a temperature of 0 °C are the optimum conditions for producing manganese carbonate nanoparticles with a width of 75 ± 25 nm. (author)
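
    For readers unfamiliar with orthogonal array analysis, the sketch below shows how factor effects in an OA9 experiment can be screened with one-way ANOVA, in the spirit of the record above. The design matrix is the standard L9(3^4) array; the factor names and particle-width responses are hypothetical placeholders, not the authors' data.

```python
import numpy as np
from scipy.stats import f_oneway

# Standard Taguchi L9(3^4) orthogonal array: 9 runs, 4 factors, 3 levels each.
L9 = np.array([
    [1, 1, 1, 1],
    [1, 2, 2, 2],
    [1, 3, 3, 3],
    [2, 1, 2, 3],
    [2, 2, 3, 1],
    [2, 3, 1, 2],
    [3, 1, 3, 2],
    [3, 2, 1, 3],
    [3, 3, 2, 1],
])
factors = ["Mn concentration", "carbonate concentration", "flow rate", "temperature"]

# Hypothetical particle widths (nm), one response per run -- placeholder data.
width = np.array([72.0, 95.0, 130.0, 88.0, 110.0, 75.0, 140.0, 92.0, 105.0])

# Marginal one-way ANOVA per factor; the orthogonality of the array keeps the
# level groups balanced with respect to the other factors.
for j, name in enumerate(factors):
    groups = [width[L9[:, j] == level] for level in (1, 2, 3)]
    F, p = f_oneway(*groups)
    print(f"{name:24s} F = {F:6.2f}   p = {p:.3f}")
```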

  2. Statistical methods for quality improvement

    National Research Council Canada - National Science Library

    Ryan, Thomas P

    2011-01-01

    ...."-TechnometricsThis new edition continues to provide the most current, proven statistical methods for quality control and quality improvementThe use of quantitative methods offers numerous benefits...

  3. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    Science.gov (United States)

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
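
    The minimum p-value procedure described above is straightforward to sketch. Assuming a two-arm trial with a numeric outcome and three illustrative candidate tests (the paper's own candidates and data are not reproduced here), a permutation version looks roughly like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def candidate_pvalues(x, y):
    """p-values from several pre-specified two-sample tests (illustrative set)."""
    return np.array([
        stats.ttest_ind(x, y).pvalue,        # difference in means
        stats.mannwhitneyu(x, y).pvalue,     # rank-based shift
        stats.ks_2samp(x, y).pvalue,         # any distributional difference
    ])

def min_p_permutation_test(x, y, n_perm=2000):
    pooled = np.concatenate([x, y])
    n = len(x)
    observed = candidate_pvalues(x, y).min()
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if candidate_pvalues(perm[:n], perm[n:]).min() <= observed:
            count += 1
    # Permutation p-value for the min-p statistic; the type I error rate is
    # controlled because the same minimization is applied to every permutation.
    return (count + 1) / (n_perm + 1)

x = rng.normal(0.0, 1.0, 30)   # control arm (simulated)
y = rng.normal(0.5, 1.0, 30)   # treatment arm (simulated)
print("min-p permutation p-value:", min_p_permutation_test(x, y))
```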

  4. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    Science.gov (United States)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor controlling high-frequency car interior aerodynamic noise at high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded on the model to obtain valid results for car interior noise analysis. The model is the solid foundation for further optimization of car interior noise control. After the subsystems contributing most power to car interior noise are identified by comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that the interior acoustic performance can be improved by using the detailed SEA model, which comprises more than 80 subsystems, together with the calculation of unsteady aerodynamic pressure on body surfaces and the improvement of the materials' sound and damping properties. A reduction of more than 2 dB is obtained at centre frequencies above 800 Hz in the spectrum. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and acoustic contribution sensitivity analysis.

  5. Poster - Thur Eve - 29: Detecting changes in IMRT QA using statistical process control.

    Science.gov (United States)

    Drever, L; Salomons, G

    2012-07-01

    Statistical process control (SPC) methods were used to analyze 239 measurement-based individual IMRT QA events. The selected IMRT QA events were all head and neck (H&N) cases with 70 Gy in 35 fractions, and all prostate cases with 76 Gy in 38 fractions, planned between March 2009 and 2012. The results were used to determine if the tolerance limits currently being used for IMRT QA were able to indicate if the process was under control. The SPC calculations were repeated for IMRT QA of the same type of cases that were planned after the treatment planning system was upgraded from Eclipse version 8.1.18 to version 10.0.39. The initial tolerance limits were found to be acceptable for two of the three metrics tested prior to the upgrade. After the upgrade to the treatment planning system, the SPC analysis found that the a priori limits were no longer capable of indicating control for 2 of the 3 metrics analyzed. The changes in the IMRT QA results were clearly identified using SPC, indicating that it is a useful tool for finding changes in the IMRT QA process. Routine application of SPC to IMRT QA results would help to distinguish unintentional trends and changes from the random variation in the IMRT QA results for individual plans. © 2012 American Association of Physicists in Medicine.
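
    As an illustration of the kind of SPC charting applied in studies like this, the sketch below builds a Shewhart individuals chart for a stream of per-plan QA metric values. The simulated metric, baseline length and shift are invented for the example; the abstract excerpt does not specify the chart type or limits actually used.

```python
import numpy as np

rng = np.random.default_rng(1)
qa_metric = rng.normal(97.0, 1.0, 60)      # e.g. % of points passing gamma
qa_metric[40:] -= 2.0                      # simulated shift after an upgrade

baseline = qa_metric[:30]                  # limits set from in-control data
mr = np.abs(np.diff(baseline))             # moving ranges of the baseline
center = baseline.mean()
ucl = center + 2.66 * mr.mean()            # 2.66 = 3 / d2 with d2 = 1.128
lcl = center - 2.66 * mr.mean()

for i, v in enumerate(qa_metric):
    if v > ucl or v < lcl:
        print(f"plan {i}: {v:.2f} outside control limits [{lcl:.2f}, {ucl:.2f}]")
```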

  6. Photo control of transport properties in a disordered wire: Average conductance, conductance statistics, and time-reversal symmetry

    International Nuclear Information System (INIS)

    Kitagawa, Takuya; Oka, Takashi; Demler, Eugene

    2012-01-01

    In this paper, we study the full conductance statistics of a disordered 1D wire under the application of light. We develop the transfer matrix method for periodically driven systems to analyze the conductance of a large system at a small frequency of light, where coherent photon absorptions play an important role in determining not only the average but also the shape of the conductance distributions. The average conductance under the application of light results from the competition between dynamic localization and effective dimension increase, and shows non-monotonic behavior as a function of driving amplitude. On the other hand, the shape of the conductance distribution displays a crossover phenomenon at intermediate disorder strength: the application of light dramatically changes the distribution from log-normal to normal. Furthermore, we propose that the conductance of disordered systems can be controlled by engineering the shape, frequency and amplitude of the light. Changing the shape of the driving field controls the time-reversal symmetry, and the disordered system shows behavior analogous to the negative magnetoresistance known from static weak localization. A small change of the frequency and amplitude of light leads to a large change of conductance, displaying a giant opto-response. Our work advances the perspective of controlling the mean as well as the full conductance statistics by coherently driving disordered systems. - Highlights: ► We study the conductance of disordered systems under the application of light. ► Full conductance distributions are obtained. ► A transfer matrix method is developed for driven systems. ► Conductances are dramatically modified upon the application of light. ► Time-reversal symmetry can also be controlled by light application.

  7. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S

  8. An introduction to statistical thermodynamics

    CERN Document Server

    Hill, Terrell L

    1987-01-01

    ""A large number of exercises of a broad range of difficulty make this book even more useful…a good addition to the literature on thermodynamics at the undergraduate level."" - Philosophical MagazineAlthough written on an introductory level, this wide-ranging text provides extensive coverage of topics of current interest in equilibrium statistical mechanics. Indeed, certain traditional topics are given somewhat condensed treatment to allow room for a survey of more recent advances.The book is divided into four major sections. Part I deals with the principles of quantum statistical mechanics a

  9. Complexity control in statistical learning

    Indian Academy of Sciences (India)

    Then we describe how the method of regularization is used to control complexity in learning. We discuss two examples of regularization, one in which the function space used is finite dimensional, and another in which it is a reproducing kernel Hilbert space. Our exposition follows the formulation of Cucker and Smale.
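
    To make the regularization idea concrete, here is a minimal sketch over a finite-dimensional function space (degree-9 polynomials), showing how the penalty weight controls the size of the fitted coefficients; the data and the lambda grid are illustrative assumptions, not drawn from the exposition itself.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 30))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 30)

# Finite-dimensional hypothesis space: polynomials of degree 9.
Phi = np.vander(x, 10, increasing=True)

for lam in (0.0, 1e-4, 1e-1):
    # Tikhonov-regularized least squares: minimize ||Phi w - y||^2 + lam ||w||^2.
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(10), Phi.T @ y)
    train_err = np.mean((Phi @ w - y) ** 2)
    print(f"lambda = {lam:g}: ||w|| = {np.linalg.norm(w):10.2f}, "
          f"train MSE = {train_err:.4f}")
```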

  10. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    Science.gov (United States)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred, which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals satisfying or exceeding design specifications.

  11. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method: An Example with Alendronate.

    Science.gov (United States)

    van de Glind, Esther M M; Willems, Hanna C; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F; Hooft, Lotty; de Rooij, Sophia E; Black, Dennis M; van Munster, Barbara C

    2016-05-01

    For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in relation to fracture risk with alendronate versus placebo in postmenopausal women. We performed a post hoc analysis of the Fracture Intervention Trial (FIT), a randomized, controlled trial that investigated the effect of alendronate versus placebo on fracture risk in postmenopausal women. We used SPC, a statistical method used for monitoring processes for quality control, to determine if and when the intervention group benefited significantly more than the control group. SPC discriminated between the normal variations over time in the numbers of fractures in both groups and the variations that were attributable to alendronate. The TTB was defined as the time point from which the cumulative difference in the number of clinical fractures remained greater than the upper control limit on the SPC chart. For the total group, the TTB was defined as 11 months. For patients aged ≥70 years, the TTB was 8 months [absolute risk reduction (ARR) = 1.4%]; for patients aged <70 years, it was 19 months (ARR = 0.7%). SPC is a clear and understandable graphical method to determine the TTB. Its main advantage is that there is no need to define a prespecified time point, as is the case in traditional survival analyses. Prescribing alendronate to patients who are aged ≥70 years is useful because the TTB shows that they will benefit after 8 months. Investigators should report the TTB to simplify clinical decision making.
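
    The essence of the method (a cumulative between-arm difference compared against an SPC-style control limit, with the TTB taken as the point from which the difference remains above the limit) can be sketched as follows. The monthly fracture counts, the effect size and the simple 3-sigma limit are simulated assumptions, not FIT data.

```python
import numpy as np

rng = np.random.default_rng(3)
months = 48
# Simulated monthly fracture counts per arm (treatment truly protective).
control = rng.poisson(5.0, months)
treated = rng.poisson(2.5, months)

diff = np.cumsum(control - treated)                  # cumulative benefit signal
sigma = np.sqrt(control.var() + treated.var())       # s.d. of one monthly difference
ucl = 3 * sigma * np.sqrt(np.arange(1, months + 1))  # 3-sigma limit for a k-month sum

above = diff > ucl
# TTB: first month from which the cumulative difference *remains* above the limit.
sustained = [k for k in range(months) if above[k:].all()]
print(f"time to benefit: month {sustained[0] + 1}" if sustained
      else "no sustained benefit detected")
```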

  12. Effects of the food additive, citric acid, on kidney cells of mice.

    Science.gov (United States)

    Chen, Xg; Lv, Qx; Liu, Ym; Deng, W

    2015-01-01

    Citric acid is a food additive that is widely used in the food and drink industry. We investigated the effects of citric acid injection on mouse kidney. Forty healthy mice were divided into four groups of 10 including one control group and three citric acid-treated groups. Low dose, middle dose and high dose groups were given doses of 120, 240 and 480 mg/kg of citric acid, respectively. On day 7, kidney tissues were collected for histological, biochemical and molecular biological examination. We observed shrinkage of glomeruli, widened urinary spaces and capillary congestion, narrowing of the tubule lumen, edema and cytoplasmic vacuolated tubule cells, and appearance of pyknotic nuclei. The relation between histopathological changes and citric acid was dose dependent. Compared to the control, T-SOD and GSH-Px activities in the treated groups decreased with increasing doses of citric acid, NOS activity tended to increase, and H2O2 and MDA contents gradually decreased, but the differences between any treated group and the control were not statistically significant. The apoptosis assay showed a dose-dependent increase of caspase-3 activity after administering citrate that was statistically significant. DNA ladder formation occurred after treatment with any dose of citric acid. We concluded that administration of citric acid may cause renal toxicity in mice.

  13. Stochastic optimal control as non-equilibrium statistical mechanics: calculus of variations over density and current

    Science.gov (United States)

    Chernyak, Vladimir Y.; Chertkov, Michael; Bierkens, Joris; Kappen, Hilbert J.

    2014-01-01

    In stochastic optimal control (SOC) one minimizes the average cost-to-go, which consists of the cost-of-control (amount of effort), the cost-of-space (where one wants the system to be) and the target cost (where one wants the system to arrive), for a system participating in forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose a derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest a hydrodynamic interpretation and discuss examples, e.g., ergodic control of a particle within a circle, illustrating non-equilibrium space-time complexity.
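
    For orientation, a standard (non-extended) SOC formulation for controlled Langevin dynamics, with the three cost terms named in the abstract, can be written as follows; the notation is assumed for illustration, and the paper's gauge-extended version adds the cost-of-dynamics term on top of this.

```latex
% Standard SOC objective for controlled Langevin dynamics
%   dx_t = (f(x_t) + u(x_t, t)) dt + dW_t
% (notation assumed for illustration, not quoted from the paper):
\[
  J(x,t) = \min_{u}\,
    \mathbb{E}\!\left[ \int_t^T \Big( \tfrac{1}{2}\|u(x_s,s)\|^2
      + V(x_s,s) \Big)\, ds + \Phi(x_T) \,\Big|\, x_t = x \right],
\]
% with cost-of-control \|u\|^2/2, cost-of-space V and target cost \Phi.
% Dynamic programming then yields the Hamilton--Jacobi--Bellman equation
\[
  -\partial_t J = \min_{u}\Big( \tfrac{1}{2}\|u\|^2 + V + (f+u)\cdot\nabla J
      + \tfrac{1}{2}\Delta J \Big)
    = V + f\cdot\nabla J - \tfrac{1}{2}\|\nabla J\|^2 + \tfrac{1}{2}\Delta J,
\]
% the optimum being u^* = -\nabla J; the paper's extension adds a
% cost-of-dynamics term characterized by a vector potential.
```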

  14. Statistical time lags in ac discharges

    International Nuclear Information System (INIS)

    Sobota, A; Kanters, J H M; Van Veldhuizen, E M; Haverlag, M; Manders, F

    2011-01-01

    The paper presents statistical time lags measured for breakdown events in near-atmospheric pressure argon and xenon. AC voltage at 100, 400 and 800 kHz was used to drive the breakdown processes, and the voltage amplitude slope was varied between 10 and 1280 V ms^-1. The values obtained for the statistical time lags are roughly between 1 and 150 ms. It is shown that the statistical time lags in ac-driven discharges follow the same general trends as in discharges driven by voltage of monotonic slope. In addition, the validity of the Cobine-Easton expression is tested for an alternating voltage form.

  15. Statistical time lags in ac discharges

    Energy Technology Data Exchange (ETDEWEB)

    Sobota, A; Kanters, J H M; Van Veldhuizen, E M; Haverlag, M [Eindhoven University of Technology, Department of Applied Physics, Postbus 513, 5600MB Eindhoven (Netherlands); Manders, F, E-mail: a.sobota@tue.nl [Philips Lighting, LightLabs, Mathildelaan 1, 5600JM Eindhoven (Netherlands)

    2011-04-06

    The paper presents statistical time lags measured for breakdown events in near-atmospheric pressure argon and xenon. AC voltage at 100, 400 and 800 kHz was used to drive the breakdown processes, and the voltage amplitude slope was varied between 10 and 1280 V ms^-1. The values obtained for the statistical time lags are roughly between 1 and 150 ms. It is shown that the statistical time lags in ac-driven discharges follow the same general trends as in discharges driven by voltage of monotonic slope. In addition, the validity of the Cobine-Easton expression is tested for an alternating voltage form.

  16. Effectiveness of a healthy lifestyle intervention for low back pain and osteoarthritis of the knee: protocol and statistical analysis plan for two randomised controlled trials

    Directory of Open Access Journals (Sweden)

    Kate M. O’Brien

    Full Text Available ABSTRACT Background These trials are the first randomised controlled trials of telephone-based weight management and healthy lifestyle interventions for low back pain and knee osteoarthritis. This article describes the protocol and statistical analysis plan. Method These trials are parallel randomised controlled trials that investigate and compare the effect of a telephone-based weight management and healthy lifestyle intervention for improving pain intensity in overweight or obese patients with low back pain or knee osteoarthritis. The analysis plan was finalised prior to initiation of analyses. All data collected as part of the trial were reviewed, without stratification by group, and classified by baseline characteristics, process of care and trial outcomes. Trial outcomes were classified as primary and secondary outcomes. Appropriate descriptive statistics and statistical testing of between-group differences, where relevant, have been planned and described. Conclusions A protocol for standard analyses was developed for the results of two randomised controlled trials. This protocol describes the data, and the pre-determined statistical tests of relevant outcome measures. The plan demonstrates transparent and verifiable use of the data collected. This a priori protocol will be followed to ensure rigorous standards of data analysis are strictly adhered to.

  17. Effectiveness of a healthy lifestyle intervention for low back pain and osteoarthritis of the knee: protocol and statistical analysis plan for two randomised controlled trials

    Science.gov (United States)

    O’Brien, Kate M.; Williams, Amanda; Wiggers, John; Wolfenden, Luke; Yoong, Serene; Campbell, Elizabeth; Kamper, Steven J.; McAuley, James; Attia, John; Oldmeadow, Chris; Williams, Christopher M.

    2016-01-01

    ABSTRACT Background These trials are the first randomised controlled trials of telephone-based weight management and healthy lifestyle interventions for low back pain and knee osteoarthritis. This article describes the protocol and statistical analysis plan. Method These trials are parallel randomised controlled trials that investigate and compare the effect of a telephone-based weight management and healthy lifestyle intervention for improving pain intensity in overweight or obese patients with low back pain or knee osteoarthritis. The analysis plan was finalised prior to initiation of analyses. All data collected as part of the trial were reviewed, without stratification by group, and classified by baseline characteristics, process of care and trial outcomes. Trial outcomes were classified as primary and secondary outcomes. Appropriate descriptive statistics and statistical testing of between-group differences, where relevant, have been planned and described. Conclusions A protocol for standard analyses was developed for the results of two randomised controlled trials. This protocol describes the data, and the pre-determined statistical tests of relevant outcome measures. The plan demonstrates transparent and verifiable use of the data collected. This a priori protocol will be followed to ensure rigorous standards of data analysis are strictly adhered to. PMID:27683839

  18. Development of SM-2 emulsion detector and its application to automatic control of deemulsifying agent addition

    International Nuclear Information System (INIS)

    Wu Hongpei.

    1985-01-01

    Emulsion phenomena have occurred in trifattyamine solvent extraction in some uranium mills, owing to the presence of colloidal polysilicic acid in the feed solutions at concentrations as high as >= 0.46 g/L (based on SiO 2 ). Polyether has been used as the deemulsifying agent to remove colloidal polysilicic acid from the feed solutions in question. In order to reduce polyether consumption, the SM-2 emulsion detector was developed and used for automatic control of polyether addition into the feed solution. The working principle and basic structure of the SM-2 detector are described. When polyether solution is added to the feed solution, a certain turbidity develops owing to the flocculated particles of polysilicic acid. A linear relationship was found between this turbidity and the photoelectric voltage difference, in millivolts, detected by the SM-2 detector. It is therefore feasible to determine experimentally the minimum concentration of polysilicic acid above which emulsion may occur. Taking advantage of this linear relationship, the addition of polyether solution can be controlled automatically in an appropriate amount so that no emulsion occurs during solvent extraction. A scheme for the automatic control of polyether solution addition is also presented

  19. Radiation effects on cancer mortality among A-bomb survivors, 1950-72. Comparison of some statistical models and analysis based on the additive logit model

    Energy Technology Data Exchange (ETDEWEB)

    Otake, M [Hiroshima Univ. (Japan). Faculty of Science

    1976-12-01

    Various statistical models designed to determine the effects of radiation dose on mortality from specific cancers among atomic bomb survivors in Hiroshima and Nagasaki were evaluated on the basis of a basic k(age) × c(dose) × 2 contingency table. Considering the applicability and fit of the different models, an analysis based on the additive logit model was applied to the mortality experience of this population during the 22-year period from 1 October 1950 to 31 December 1972. The advantages and disadvantages of the additive logit model are demonstrated. Leukemia mortality showed a sharp rise with increasing dose. The dose-response relationship suggests a possible curvature or a log-linear model, particularly if doses estimated to be more than 600 rad were set arbitrarily at 600 rad, since the average dose in the 200+ rad group would then change from 434 to 350 rad. In the 22-year period from 1950 to 1972, a high mortality risk due to radiation was observed in survivors with doses of 200 rad and over for all cancers except leukemia. During the most recent period, from 1965 to 1972, a significant risk was also noted for stomach and breast cancers. Survivors who were 9 years old or less at the time of the bomb and who were exposed to high doses of 200+ rad appeared to show a high mortality risk for all cancers except leukemia, although the number of observed deaths is still small. A number of interesting areas are discussed from the statistical and epidemiological standpoints, i.e., the numerical comparison of risks in various models, the general evaluation of cancer mortality by the additive logit model, the dose-response relationship, the relative risk in the high-dose group, the time period of radiation-induced cancer mortality, the difference in dose response between Hiroshima and Nagasaki, and the relative biological effectiveness of neutrons.
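
    The abstract does not spell the model out, so, for orientation, one standard additive logit specification for a k(age) × c(dose) × 2 table is sketched below; the paper's exact parametrization may differ.

```latex
% One standard additive logit specification for a k(age) x c(dose) x 2
% mortality table (an assumed form; the paper's exact parametrization may
% differ): age and dose act additively on the log-odds of death,
\[
  \operatorname{logit} p_{ij}
    = \ln\!\frac{p_{ij}}{1-p_{ij}}
    = \alpha_i + \beta_j ,
  \qquad i = 1,\dots,k \ (\text{age}), \quad j = 1,\dots,c \ (\text{dose}),
\]
% so that exp(beta_j - beta_1) is the mortality odds ratio of dose group j
% relative to the lowest dose group, common across age strata.
```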

  20. Statistical characterization of discrete conservative systems: The web map

    Science.gov (United States)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-10-01

    We numerically study the two-dimensional, area-preserving, web map. When the map is governed by ergodic behavior, it is, as expected, correctly described by Boltzmann-Gibbs statistics, based on the additive entropic functional $S_{BG}[p(x)] = -k \int dx\, p(x) \ln p(x)$. In contrast, possible ergodicity breakdown and transitory sticky dynamical behavior drag the map into the realm of generalized q-statistics, based on the nonadditive entropic functional $S_q[p(x)] = k\,\frac{1 - \int dx\, [p(x)]^q}{q - 1}$ ($q \in \mathbb{R}$; $S_1 = S_{BG}$). We statistically describe the system (probability distribution of the sum of successive iterates, sensitivity to the initial condition, and entropy production per unit time) for typical values of the parameter that controls the ergodicity of the map. For small (large) values of the external parameter K, we observe q-Gaussian distributions with $q = 1.935\ldots$ (Gaussian distributions), like for the standard map. In contrast, for intermediate values of K, we observe a different scenario, due to the fractal structure of the trajectories embedded in the chaotic sea. Long-standing non-Gaussian distributions are characterized in terms of the kurtosis and the box-counting dimension of the chaotic sea.

  1. Application of Snowfall and Wind Statistics to Snow Transport Modeling for Snowdrift Control in Minnesota.

    Science.gov (United States)

    Shulski, Martha D.; Seeley, Mark W.

    2004-11-01

    Models were utilized to determine the snow accumulation season (SAS) and to quantify windblown snow for the purpose of snowdrift control at locations in Minnesota. The models require mean monthly temperature, snowfall, density of snow, and wind frequency distribution statistics. Temperature and precipitation data were obtained from local cooperative observing sites, and wind data came from Automated Surface Observing System (ASOS)/Automated Weather Observing System (AWOS) sites in the region. The temperature-based algorithm used to define the SAS reveals a geographic variability in the starting and ending dates of the season, which is determined by latitude and elevation. Mean seasonal snowfall shows a geographic distribution that is affected by topography and proximity to Lake Superior. Mean snowfall density also exhibits variability, with lower-density snow events displaced to higher-latitude positions. Seasonal wind frequencies show a strong bimodal distribution with peaks from the northwest and southeast vector directions, with an exception for locations in close proximity to the Lake Superior shoreline. In addition, for western and south-central Minnesota there is a considerably higher frequency of wind speeds above the mean snow transport threshold of 7 m s^-1. As such, this area is more conducive to higher potential snow transport totals. Snow relocation coefficients in this area are in the range of 0.4-0.9, and, according to the empirical models used in this analysis, this range implies that actual snow transport is 40%-90% of the total potential in south-central and western areas of the state.

  2. A simulation study to evaluate the performance of five statistical monitoring methods when applied to different time-series components in the context of control programs for endemic diseases

    DEFF Research Database (Denmark)

    Lopes Antunes, Ana Carolina; Jensen, Dan; Hisham Beshara Halasa, Tariq

    2017-01-01

    Disease monitoring and surveillance play a crucial role in control and eradication programs, as it is important to track implemented strategies in order to reduce and/or eliminate a specific disease. The objectives of this study were to assess the performance of different statistical monitoring......, decreases and constant sero-prevalence levels (referred to as events). Two state-space models were used to model the time series, and different statistical monitoring methods (such as univariate process control algorithms–the Shewhart Control Chart, Tabular Cumulative Sums, and the V-mask–and monitoring... of noise in the baseline was greater for the Shewhart Control Chart and Tabular Cumulative Sums than for the V-Mask and trend-based methods. The performance of the different statistical monitoring methods varied when monitoring increases and decreases in disease sero-prevalence. Combining two or more...

  3. A simulation study to evaluate the performance of five statistical monitoring methods when applied to different time-series components in the context of control programs for endemic diseases

    DEFF Research Database (Denmark)

    Lopes Antunes, Ana Carolina; Jensen, Dan; Hisham Beshara Halasa, Tariq

    2017-01-01

    Disease monitoring and surveillance play a crucial role in control and eradication programs, as it is important to track implemented strategies in order to reduce and/or eliminate a specific disease. The objectives of this study were to assess the performance of different statistical monitoring..., decreases and constant sero-prevalence levels (referred to as events). Two state-space models were used to model the time series, and different statistical monitoring methods (such as univariate process control algorithms–the Shewhart Control Chart, Tabular Cumulative Sums, and the V-mask–and monitoring... of noise in the baseline was greater for the Shewhart Control Chart and Tabular Cumulative Sums than for the V-Mask and trend-based methods. The performance of the different statistical monitoring methods varied when monitoring increases and decreases in disease sero-prevalence. Combining two or more...

  4. Additive Effect of Plasma Rich in Growth Factors With Guided Tissue Regeneration in Treatment of Intrabony Defects in Patients With Chronic Periodontitis: A Split-Mouth Randomized Controlled Clinical Trial.

    Science.gov (United States)

    Ravi, Sheethalan; Malaiappan, Sankari; Varghese, Sheeja; Jayakumar, Nadathur D; Prakasam, Gopinath

    2017-09-01

    Periodontal regeneration can be defined as complete restoration of lost periodontal tissues to their original architecture and function. A variety of treatment modalities have been proposed to achieve it. Plasma rich in growth factors (PRGF) is a concentrated suspension of growth factors that promotes restoration of lost periodontal tissues. The objective of the present study is to assess the effect of PRGF associated with guided tissue regeneration (GTR) versus GTR alone in the treatment of intrabony defects (IBDs) in patients with chronic periodontitis (CP). Patients with CP (n = 14) with 42 contralateral 2- and 3-walled defects were randomly assigned to test (PRGF+GTR) and control (GTR alone) treatment groups. Clinical and radiographic assessments performed at baseline and after 6 months were: 1) gingival index (GI), 2) probing depth (PD), 3) clinical attachment level (CAL), 4) radiologic defect depth, and 5) bone fill. Comparison of parameters measured at baseline and after 6 months showed a mean PD reduction of 3.37 ± 1.62 mm in the control group (P <0.001) and 4.13 ± 1.59 mm in the test group (P <0.001). There was a significant difference in the mean change in CAL (P <0.001) in the control group (5.42 ± 1.99) and the test group (5.99 ± 1.77). The mean change in GI was 1.89 ± 0.32 and 1.68 ± 0.58 in the control group and test group, respectively, and the difference was statistically significant (P <0.001). When compared between groups, the clinical parameters did not show any statistically significant variations. Mean radiographic bone fill was 1.06 ± 0.81 and 1.0 ± 0.97 in the control group and test group, respectively; however, the difference was not statistically significant. PRGF with GTR, as well as GTR alone, was effective in improving the clinical and radiographic parameters of patients with CP at the 6-month follow-up. There was no additive effect of PRGF when used along with GTR in the treatment of IBDs in patients with CP in terms of both clinical and radiographic parameters.

  5. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
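
    For reference, the between-groups d that this single-case statistic is designed to match is the familiar standardized mean difference (standard definition, stated here for orientation rather than quoted from the paper):

```latex
% The usual between-groups standardized mean difference that the single-case
% d is designed to estimate (standard definition, not quoted from the paper):
\[
  d = \frac{\bar{y}_T - \bar{y}_C}{s_p},
  \qquad
  s_p = \sqrt{\frac{(n_T - 1)\,s_T^2 + (n_C - 1)\,s_C^2}{n_T + n_C - 2}},
\]
% where T and C index the treatment and control groups; the single-case
% version targets the same quantity from within-case interrupted time series.
```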

  6. Graph theory applied to noise and vibration control in statistical energy analysis models.

    Science.gov (United States)

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

    A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists in first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of any order paths between subsystems, counting and labeling them, finding extremal paths, or determining the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
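
    The path-counting side of this approach is easy to sketch: powers of an SEA graph's adjacency matrix count the transmission paths (walks, which may revisit subsystems) of each order between two subsystems. The 5-subsystem graph below is an invented toy example, not a model from the paper.

```python
import numpy as np

# adjacency[i, j] = 1 if energy can flow from subsystem i to subsystem j.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 0, 1, 0],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 0, 1],
    [0, 0, 1, 1, 0],
])

source, receiver = 0, 4
for order in range(1, 5):
    # (A^k)[i, j] counts walks of length k from i to j.
    count = np.linalg.matrix_power(A, order)[source, receiver]
    print(f"paths of order {order} from subsystem {source} to {receiver}: {count}")
```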

  7. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
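
    For reference, the Renyi entropy underlying these ensembles is given by the standard definition below; the paper's normalization conventions may differ.

```latex
% Standard definition (the paper's normalization conventions may differ):
\[
  S_R = \frac{k}{1-q}\,\ln \sum_i p_i^{\,q},
  \qquad q > 0,\; q \neq 1,
\]
% which recovers the Boltzmann--Gibbs entropy
%   S_{BG} = -k \sum_i p_i \ln p_i
% in the limit q \to 1, consistent with the equivalence described above.
```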

  8. An Update on Statistical Boosting in Biomedicine.

    Science.gov (United States)

    Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf

    2017-01-01

    Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
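
    To illustrate the core mechanics reviewed above, here is a minimal componentwise L2 boosting sketch: each iteration updates only the best-fitting base-learner, which produces the automated variable selection and implicit shrinkage the review highlights. The data, step length and stopping iteration are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 200, 10
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(0, 0.5, n)   # sparse truth

nu, n_iter = 0.1, 250          # step length and number of boosting steps
beta = np.zeros(p)
offset = y.mean()              # constant initialization
resid = y - offset

for _ in range(n_iter):
    # Fit each simple least-squares base-learner to the current residuals.
    coefs = X.T @ resid / (X ** 2).sum(axis=0)
    sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
    j = np.argmin(sse)          # update the best-fitting component only
    beta[j] += nu * coefs[j]    # weak update => implicit regularization
    resid = y - offset - X @ beta

print("selected coefficients:", np.round(beta, 2))
```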

  9. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control

    Directory of Open Access Journals (Sweden)

    Jude Adekunle Adeleke

    2017-04-01

    Full Text Available Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short-term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short-term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 are achieved over half-hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of the Semantic Sensor Web.
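
    A sliding-window MLP predictor of the kind described can be sketched in a few lines. The synthetic PM2.5 series, window length, horizon and network size below are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
t = np.arange(2000)
# Synthetic PM2.5 series with a daily cycle (288 five-minute samples per day).
pm25 = 20 + 10 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 2, t.size)

window, horizon = 12, 6        # 12 past samples -> 6 steps (~30 min) ahead
X = np.array([pm25[i:i + window] for i in range(len(pm25) - window - horizon)])
y = pm25[window + horizon:]

split = 1500                   # chronological train/test split
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
mlp.fit(X[:split], y[:split])

pred = mlp.predict(X[split:])
print("test RMSE:", np.sqrt(np.mean((pred - y[split:]) ** 2)))
```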

  10. Integrating Statistical Machine Learning in a Semantic Sensor Web for Proactive Monitoring and Control.

    Science.gov (United States)

    Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka

    2017-04-09

    Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short-term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short-term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 are achieved over half-hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of the Semantic Sensor Web.

  11. The food additive vanillic acid controls transgene expression in mammalian cells and mice.

    Science.gov (United States)

    Gitzinger, Marc; Kemmer, Christian; Fluri, David A; El-Baba, Marie Daoud; Weber, Wilfried; Fussenegger, Martin

    2012-03-01

    Trigger-inducible transcription-control devices that reversibly fine-tune transgene expression in response to molecular cues have significantly advanced the rational reprogramming of mammalian cells. When designed for use in future gene- and cell-based therapies, the trigger molecules have to be carefully chosen in order to provide maximum specificity, minimal side-effects and optimal pharmacokinetics in a mammalian organism. Capitalizing on control components that enable Caulobacter crescentus to metabolize vanillic acid originating from lignin degradation that occurs in its oligotrophic freshwater habitat, we have designed synthetic devices that specifically adjust transgene expression in mammalian cells when exposed to vanillic acid. Even in mice, transgene expression was robust, precise and tunable in response to vanillic acid. As a licensed food additive that is regularly consumed by humans via flavoured convenience food and specific fresh vegetables and fruits, vanillic acid can be considered a safe trigger molecule that could be used for diet-controlled transgene expression in future gene- and cell-based therapies.

  12. Statistical models and NMR analysis of polymer microstructure

    Science.gov (United States)

    Statistical models can be used in conjunction with NMR spectroscopy to study polymer microstructure and polymerization mechanisms. Thus, Bernoullian, Markovian, and enantiomorphic-site models are well known. Many additional models have been formulated over the years for additional situations. Typica...

  13. Re-analysis of survival data of cancer patients utilizing additive homeopathy.

    Science.gov (United States)

    Gleiss, Andreas; Frass, Michael; Gaertner, Katharina

    2016-08-01

    In this short communication we present a re-analysis of homeopathic patient data in comparison to control patient data from the same Outpatient's Unit "Homeopathy in malignant diseases" of the Medical University of Vienna. In this analysis we took account of a probable immortal time bias. For patients suffering from advanced stages of cancer and surviving the first 6 or 12 months after diagnosis, respectively, the results show that utilizing homeopathy gives a statistically significant (p<0.001) advantage over control patients regarding survival time. In conclusion, bearing in mind all limitations, the results of this retrospective study suggest that patients with advanced stages of cancer might benefit from additional homeopathic treatment until a survival time of up to 12 months after diagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Multisample adjusted U-statistics that account for confounding covariates.

    Science.gov (United States)

    Satten, Glen A; Kong, Maiying; Datta, Somnath

    2018-06-19

    Multisample U-statistics encompass a wide class of test statistics that allow the comparison of 2 or more distributions. U-statistics are especially powerful because they can be applied to both numeric and nonnumeric data, eg, ordinal and categorical data where a pairwise similarity or distance-like measure between categories is available. However, when comparing the distribution of a variable across 2 or more groups, observed differences may be due to confounding covariates. For example, in a case-control study, the distribution of exposure in cases may differ from that in controls entirely because of variables that are related to both exposure and case status and are distributed differently among case and control participants. We propose to use individually reweighted data (ie, using the stratification score for retrospective data or the propensity score for prospective data) to construct adjusted U-statistics that can test the equality of distributions across 2 (or more) groups in the presence of confounding covariates. Asymptotic normality of our adjusted U-statistics is established and a closed form expression of their asymptotic variance is presented. The utility of our approach is demonstrated through simulation studies, as well as in an analysis of data from a case-control study conducted among African-Americans, comparing whether the similarity in haplotypes (ie, sets of adjacent genetic loci inherited from the same parent) occurring in a case and a control participant differs from the similarity in haplotypes occurring in 2 control participants. Copyright © 2018 John Wiley & Sons, Ltd.

  15. Fluxional additives: a second generation control in enantioselective catalysis.

    Science.gov (United States)

    Sibi, Mukund P; Manyem, Shankar; Palencia, Hector

    2006-10-25

    The concept of "fluxional additives", additives that can adopt enantiomeric conformations depending on the chiral information in the ligand, is demonstrated in enantioselective Diels-Alder and nitrone cycloaddition reactions. The additive design is modular, and diverse structures are accessible in three steps. Chiral Lewis acids from main group and transition metals show enhancements in enantioselectivity in the presence of these additives.

  16. Prediction of lacking control power in power plants using statistical models

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Mataji, B.; Stoustrup, Jakob

    2007-01-01

    Prediction of the performance of plants like power plants is of interest, since the plant operator can use these predictions to optimize the plant production. In this paper the focus is on a special case where a combination of high coal moisture content and a high load limits the possible plant load, meaning that the requested plant load cannot be met. The available models are in this case uncertain. Instead, statistical methods are used to predict upper and lower uncertainty bounds on the prediction. Two different methods are used. The first relies on statistics of recent prediction errors; the second uses operating-point-dependent statistics of prediction errors. Using these methods on the previously mentioned case, it can be concluded that the second method can be used to predict the power plant performance, while the first method has problems predicting the uncertain performance.

  17. Bulk tank somatic cell counts analyzed by statistical process control tools to identify and monitor subclinical mastitis incidence.

    Science.gov (United States)

    Lukas, J M; Hawkins, D M; Kinsel, M L; Reneau, J K

    2005-11-01

    The objective of this study was to examine the relationship between monthly Dairy Herd Improvement (DHI) subclinical mastitis and new infection rate estimates and daily bulk tank somatic cell count (SCC) summarized by statistical process control tools. Dairy Herd Improvement Association test-day subclinical mastitis and new infection rate estimates along with daily or every other day bulk tank SCC data were collected for 12 mo of 2003 from 275 Upper Midwest dairy herds. Herds were divided into 5 herd production categories. A linear score [LNS = ln(BTSCC/100,000)/0.693147 + 3] was calculated for each individual bulk tank SCC. For both the raw SCC and the transformed data, the mean and sigma were calculated using the statistical quality control individual measurement and moving range chart procedure of Statistical Analysis System. One hundred eighty-three herds of the 275 herds from the study data set were then randomly selected and the raw (method 1) and transformed (method 2) bulk tank SCC mean and sigma were used to develop models for predicting subclinical mastitis and new infection rate estimates. Herd production category was also included in all models as 5 dummy variables. Models were validated by calculating estimates of subclinical mastitis and new infection rates for the remaining 92 herds and plotting them against observed values of each of the dependents. Only herd production category and bulk tank SCC mean were significant and remained in the final models. High R2 values (0.83 and 0.81 for methods 1 and 2, respectively) indicated a strong correlation between the bulk tank SCC and herd's subclinical mastitis prevalence. The standard errors of the estimate were 4.02 and 4.28% for methods 1 and 2, respectively, and decreased with increasing herd production. As a case study, Shewhart Individual Measurement Charts were plotted from the bulk tank SCC to identify shifts in mastitis incidence. Four of 5 charts examined signaled a change in bulk tank SCC before
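The linear score and the individuals/moving-range control limits used above can be reproduced in a few lines; the SCC values below are invented for illustration, and the 2.66 multiplier is the standard Shewhart constant (3/1.128) for individuals charts.

```python
import numpy as np

def linear_score(btscc):
    # LNS = ln(BTSCC/100,000) / ln(2) + 3   (0.693147 = ln 2)
    return np.log(btscc / 100_000) / 0.693147 + 3

def imr_limits(x):
    """Shewhart individuals-chart limits from the average moving range.

    Sigma is estimated as MRbar/1.128, so the 3-sigma limits are
    mean +/- 2.66 * MRbar.
    """
    mr = np.abs(np.diff(x))
    center, mrbar = x.mean(), mr.mean()
    return center - 2.66 * mrbar, center, center + 2.66 * mrbar

# Illustrative bulk tank SCC series (cells/mL); the values are made up.
scc = np.array([180e3, 210e3, 195e3, 250e3, 230e3, 400e3, 220e3, 205e3])
lns = linear_score(scc)
lcl, center, ucl = imr_limits(lns)
print(np.round([lcl, center, ucl], 2))
print("signals at indices:", np.where((lns < lcl) | (lns > ucl))[0])
```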

  18. Environmental restoration and statistics: Issues and needs

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1991-10-01

    Statisticians have a vital role to play in environmental restoration (ER) activities. One facet of that role is to point out where additional work is needed to develop statistical sampling plans and data analyses that meet the needs of ER. This paper is an attempt to show where statistics fits into the ER process. The statistician, as member of the ER planning team, works collaboratively with the team to develop the site characterization sampling design, so that data of the quality and quantity required by the specified data quality objectives (DQOs) are obtained. At the same time, the statistician works with the rest of the planning team to design and implement, when appropriate, the observational approach to streamline the ER process and reduce costs. The statistician will also provide the expertise needed to select or develop appropriate tools for statistical analysis that are suited for problems that are common to waste-site data. These data problems include highly heterogeneous waste forms, large variability in concentrations over space, correlated data, data that do not have a normal (Gaussian) distribution, and measurements below detection limits. Other problems include environmental transport and risk models that yield highly uncertain predictions, and the need to effectively communicate to the public highly technical information, such as sampling plans, site characterization data, statistical analysis results, and risk estimates. Even though some statistical analysis methods are available "off the shelf" for use in ER, these problems require the development of additional statistical tools, as discussed in this paper. 29 refs

  19. Statistical Learning and Adaptive Decision-Making Underlie Human Response Time Variability in Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Ning Ma

    2015-08-01

    Full Text Available Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task, in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n=20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.

  20. Statistical learning and adaptive decision-making underlie human response time variability in inhibitory control.

    Science.gov (United States)

    Ma, Ning; Yu, Angela J

    2015-01-01

    Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task (SST), in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n = 20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.
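A minimal sketch of the kind of trial-wise regression the authors report (RT as a function of expected P(stop) and SSD), run here on simulated data with arbitrary effect sizes; it assumes the statsmodels package and only checks that both slopes come out positive and significant.

```python
import numpy as np
import statsmodels.api as sm

# Simulate trials where RT rises with both predictors, per the paper's
# qualitative prediction; intercept, slopes and noise are invented.
rng = np.random.default_rng(2)
n = 500
pstop = rng.uniform(0.1, 0.4, n)        # trial-wise expected P(stop)
ssd = rng.uniform(0.1, 0.5, n)          # expected stop-signal delay (s)
rt = 0.35 + 0.30 * pstop + 0.15 * ssd + rng.normal(0, 0.05, n)

X = sm.add_constant(np.column_stack([pstop, ssd]))
fit = sm.OLS(rt, X).fit()
print(fit.params)    # both slopes recovered as positive
print(fit.pvalues)   # both predictors significant
```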

  1. The Statistical Value Chain - a Benchmarking Checklist for Decision Makers to Evaluate Decision Support Seen from a Statistical Point-Of-View

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg; Henningsen, Geraldine; Wood, Christian D.

    2013-01-01

    quantitative methods exist for evaluating uncertainty—for example, Monte Carlo simulation—and such methods work very well when the AN is in full control of the data collection and model-building processes. In many cases, however, the AN is not in control of these processes. In this article we develop a simple method that a DM can employ in order to evaluate the process of decision support from a statistical point-of-view. We call this approach the "Statistical Value Chain" (SVC): a consecutive benchmarking checklist with eight steps that can be used to evaluate decision support seen from a statistical point-of-view.

  2. EFFECTS OF POLYMERIC ADDITIVES ON RHEOLOGICAL PROPERTIES OF DRILLING FLUIDS

    Directory of Open Access Journals (Sweden)

    Danielly Vieira de Lucena

    2014-03-01

    Full Text Available The influence of carboxymethylcellulose, CMC (filtrate reducer), and xanthan gum (viscosifier) on the plastic and apparent viscosity, the yield strength, and the volume of filtrate of water-based drilling fluids was investigated based on statistical design. Five formulations spanning a range of commercially used concentrations were utilized in the design of the experiment. The formulations were prepared in accordance with Petrobras company standards. Regression models were calculated and correlated with the properties of the compositions. The relevance and validation of the models were confirmed by statistical analysis. The design can be applied to statistically optimize the mud properties considering the addition of CMC and xanthan gum, and to provide a better understanding of the influence of additives on the properties of a water-based polymer fluid system. From the study it was observed that the values of the rheological properties vary with the concentration of additives, increasing with increasing concentration, and that higher additive concentrations reduced the filtration parameter values.

  3. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931, reprinted 1980 by ASQC); in spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. The arithmetic tolerance (worst case) allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition; an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the downstream characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
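The practical difference between the two tolerancing approaches can be shown with a simple stack-up; the tolerance values below are purely illustrative.

```python
import math

# Worst-case (arithmetic) vs. quadratic (statistical, root-sum-square)
# stack-up for a chain of independent characteristics.
tols = [0.10, 0.05, 0.08, 0.12]              # +/- tolerances of the links

worst_case = sum(tols)                        # arithmetic tolerancing
rss = math.sqrt(sum(t ** 2 for t in tols))    # quadratic tolerancing

print(f"arithmetic stack: +/-{worst_case:.3f}")   # 0.350
print(f"quadratic stack:  +/-{rss:.3f}")          # 0.182
```

The quadratic stack is tighter precisely because it assumes independent, centered, roughly normal deviations, which is why the paper stresses the exchange of information between design and manufacturing.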

  4. Testing for Statistical Discrimination based on Gender

    DEFF Research Database (Denmark)

    Lesner, Rune Vammen

    This paper develops a model which incorporates the two most commonly cited strands of the literature on statistical discrimination, namely screening discrimination and stereotyping. The model is used to provide empirical evidence of statistical discrimination based on gender in the labour market. It is shown that the implications of both screening discrimination and stereotyping are consistent with observable wage dynamics. In addition, it is found that the gender wage gap decreases in tenure but increases in job transitions and that the fraction of women in high-ranking positions within a firm does not affect the level of statistical discrimination by gender.

  5. Applied multivariate statistics with R

    CERN Document Server

    Zelterman, Daniel

    2015-01-01

    This book brings the power of multivariate statistics to graduate-level practitioners, making these analytical methods accessible without lengthy mathematical derivations. Using the open source, shareware program R, Professor Zelterman demonstrates the process and outcomes for a wide array of multivariate statistical applications. Chapters cover graphical displays, linear algebra, univariate, bivariate and multivariate normal distributions, factor methods, linear regression, discrimination and classification, clustering, time series models, and additional methods. Zelterman uses practical examples from diverse disciplines to welcome readers from a variety of academic specialties. Those with backgrounds in statistics will learn new methods while they review more familiar topics. Chapters include exercises, real data sets, and R implementations. The data are interesting, real-world topics, particularly from health and biology-related contexts. As an example of the approach, the text examines a sample from the B...

  6. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    Science.gov (United States)

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, as well as normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (random clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
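The distinction between the relative risk and the odds ratio mentioned above reduces to different arithmetic on the same 2×2 table; the counts here are invented for illustration.

```python
# Relative risk (cohort/RCT) and odds ratio (case-control) from one
# 2x2 table:
#                 outcome+   outcome-
# exposed            a=30       b=70
# unexposed          c=10       d=90
a, b, c, d = 30, 70, 10, 90

risk_exposed = a / (a + b)
risk_unexposed = c / (c + d)
relative_risk = risk_exposed / risk_unexposed   # 0.30 / 0.10 = 3.0

odds_ratio = (a * d) / (b * c)                  # (30*90)/(70*10) ~ 3.86

print(relative_risk, odds_ratio)
```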

  7. Numerical consideration for multiscale statistical process control method applied to nuclear material accountancy

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Hori, Masato; Asou, Ryoji; Usuda, Shigekazu

    2006-01-01

    The multiscale statistical process control (MSSPC) method is applied to clarify the elements of material unaccounted for (MUF) in large scale reprocessing plants using numerical calculations. Continuous wavelet functions are used to decompose the process data, which simulate batch operation superimposed by various types of disturbance, and the disturbance components included in the data are divided into time and frequency spaces. The diagnosis of MSSPC is applied to distinguish abnormal events from the process data and shows how to detect abrupt and protracted diversions using principal component analysis. Quantitative performance of MSSPC for the time series data is shown with average run lengths given by Monte-Carlo simulation to compare to the non-detection probability β. Recent discussion about bias corrections in material balances is introduced and another approach is presented to evaluate MUF without assuming the measurement error model. (author)
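A minimal sketch of the multiscale idea, assuming the PyWavelets package: a simulated MUF-like balance sequence is decomposed by a discrete wavelet transform and each band is screened against robust 3-sigma limits. This conveys only the flavor of MSSPC, not the authors' exact diagnosis procedure (which adds principal component analysis).

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
muf = rng.normal(0.0, 1.0, 256)
muf[128:] += 0.8                       # protracted, diversion-like shift

# Decompose into one approximation band and four detail bands.
coeffs = pywt.wavedec(muf, "haar", level=4)
for band, c in enumerate(coeffs):
    med = np.median(c)
    sigma = max(np.median(np.abs(c - med)) / 0.6745, 1e-9)  # robust MAD scale
    flags = np.abs(c - med) > 3 * sigma
    print(f"band {band}: {int(flags.sum())} coefficients out of limits")
```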

  8. A Statistical Method for Aggregated Wind Power Plants to Provide Secondary Frequency Control

    DEFF Research Database (Denmark)

    Hu, Junjie; Ziras, Charalampos; Bindner, Henrik W.

    2017-01-01

    The increasing penetration of wind power brings significant challenges to power system operators due to the wind’s inherent uncertainty and variability. Traditionally, power plants and more recently demand response have been used to balance the power system. However, the use of wind power as a balancing-power source has also been investigated, especially for wind power dominated power systems such as Denmark. The main drawback is that wind power must be curtailed by setting a lower operating point, in order to offer upward regulation. We propose a statistical approach to reduce wind power curtailment for aggregated wind power plants providing secondary frequency control (SFC) to the power system. By using historical SFC signals and wind speed data, we calculate metrics for the reserve provision error as a function of the scheduled wind power. We show that wind curtailment can be significantly reduced.

  9. Macro-/Micro-Controlled 3D Lithium-Ion Batteries via Additive Manufacturing and Electric Field Processing.

    Science.gov (United States)

    Li, Jie; Liang, Xinhua; Liou, Frank; Park, Jonghyun

    2018-01-30

    This paper presents a new concept for making battery electrodes that can simultaneously control macro-/micro-structures and help address current energy storage technology gaps and future energy storage requirements. Modern batteries are fabricated in the form of laminated structures that are composed of randomly mixed constituent materials. This randomness in conventional methods can provide a possibility of developing new breakthrough processing techniques to build well-organized structures that can improve battery performance. In the proposed processing, an electric field (EF) controls the microstructures of manganese-based electrodes, while additive manufacturing controls macro-3D structures and the integration of both scales. The synergistic control of micro-/macro-structures is a novel concept in energy material processing that has considerable potential for providing unprecedented control of electrode structures, thereby enhancing performance. Electrochemical tests have shown that these new electrodes exhibit superior performance in their specific capacity, areal capacity, and life cycle.

  10. An Update on Statistical Boosting in Biomedicine

    Directory of Open Access Journals (Sweden)

    Andreas Mayr

    2017-01-01

    Full Text Available Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.

  11. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test.

    Science.gov (United States)

    Swanson, David M; Blacker, Deborah; Alchawa, Taofik; Ludwig, Kerstin U; Mangold, Elisabeth; Lange, Christoph

    2013-11-07

    The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to "filter" redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the
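One simple instance of a summary-statistic test of this type is sketched below: the sum of marker Z-statistics, scaled by its null standard deviation computed from the marker correlation matrix, is standard normal under the global null. The numbers are invented, and this is a plausible reading of the construction rather than the authors' exact statistic.

```python
import numpy as np
from scipy import stats

def summary_stat_test(z, R):
    """Gene-level test from marker Z-statistics and their correlation R.

    The sum of Z-scores, scaled by its null standard deviation
    sqrt(1' R 1), is N(0,1) under the global null even when the markers
    are in linkage disequilibrium.
    """
    z = np.asarray(z, float)
    ones = np.ones(len(z))
    t = z.sum() / np.sqrt(ones @ R @ ones)
    return t, 2 * stats.norm.sf(abs(t))

# Three markers in strong pairwise LD (r = 0.8), modest signals.
R = np.array([[1.0, 0.8, 0.8],
              [0.8, 1.0, 0.8],
              [0.8, 0.8, 1.0]])
print(summary_stat_test([2.1, 1.9, 2.0], R))   # t ~ 2.15, p ~ 0.03
```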

  12. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  13. [Cause-of-death statistics and ICD, quo vadis?]

    Science.gov (United States)

    Eckert, Olaf; Vogel, Ulrich

    2018-07-01

    The International Statistical Classification of Diseases and Related Health Problems (ICD) is the worldwide binding standard for generating underlying cause-of-death statistics. What are the effects of former revisions of the ICD on underlying cause-of-death statistics, and which opportunities and challenges are becoming apparent in a possible transition process from ICD-10 to ICD-11? This article presents the calculation of the exploitation grade of ICD-9 and ICD-10 in the German cause-of-death statistics and the quality of documentation. Approximately 67,000 anonymized German death certificates are processed by Iris/MUSE and official German cause-of-death statistics are analyzed. In addition to substantial changes in the exploitation grade in the transition from ICD-9 to ICD-10, regional effects become visible. The rate of so-called "ill-defined" conditions exceeds 10%. Despite substantial improvements across ICD revisions, there are long-known deficits in the coroner's inquest, the completion of death certificates, and the quality of coding. To make better use of the ICD as a methodological framework for mortality statistics and health reporting in Germany, the following measures are necessary: 1. General use of Iris/MUSE, 2. Establishing multiple underlying cause-of-death statistics, 3. Introduction of an electronic death certificate, 4. Improvement of the medical assessment of causes of death. Within a short time the WHO will release the 11th revision of the ICD, which will provide additional opportunities for the development of underlying cause-of-death statistics and their use in science, public health and politics. A coordinated effort including participants in the process and users is necessary to meet the related challenges.

  14. Digestive microbiota is different in pigs receiving antimicrobials or a feed additive during the nursery period.

    Directory of Open Access Journals (Sweden)

    Cassandra Soler

    Full Text Available Antimicrobials have been used prophylactically to decrease the incidence of digestive disorders during the piglet post-weaning period. Nowadays, it is urgent to reduce their consumption in livestock to address the problem of antimicrobial resistance. In this study, the effect of a product on piglet microbiota has been investigated as an alternative to antimicrobials. Three groups of ten post-weaning pigs were sampled at 0, 15 and 30 days starting one week post-weaning; the control, antibiotic and feed additive groups received a standard post-weaning diet without antibiotics or additives, the same diet as the control group but with amoxicillin and colistin sulphate, and the same diet as the control group but with a feed additive (Sanacore-EN, Nutriad International N.V.), respectively. The total DNA extracted from faeces was used to amplify the 16S RNA gene for massive sequencing under the manufacturer's conditions. Sequencing data were quality filtered and analyzed using QIIME software and suitable statistical methods. In general terms, age significantly modifies the microbiota of the piglets. Thus, the older the animal, the higher the bacterial diversity observed for the control and the feed additive groups. However, this diversity was very similar in the antibiotic group throughout the trial. Interestingly, a clear increase in the abundance of Bacillus and Lactobacillus spp was detected in the feed additive group versus the antibiotic and control groups. In conclusion, the feed additive had a positive effect on the endogenous microbiota of post-weaning pigs, increasing both the diversity of bacterial families and the abundance of lactic acid bacteria during the post-weaning period.

  15. Controlling photo-oxidation processes of a polyfluorene derivative: The effect of additives and mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, G.R. [Laboratory of Polymers and Electronic Properties of Materials – UFOP, Ouro Preto, MG (Brazil); Nowacki, B. [Paulo Scarpa Polymer Laboratory – UFPR, Curitiba, PR (Brazil); Magalhães, A. [Instituto de Química, Universidade Estadual de Campinas – UNICAMP, Campinas, SP (Brazil); Azevedo, E.R. de [Instituto de Física de São Carlos, Universidade de São Paulo – USP, São Carlos, SP (Brazil); Sá, E.L. de [Chemistry Department, Federal University of Parana, Curitiba, PR (Brazil); Akcelrud, L.C. [Paulo Scarpa Polymer Laboratory – UFPR, Curitiba, PR (Brazil); Bianchi, R.F., E-mail: bianchi@iceb.ufop.br [Laboratory of Polymers and Electronic Properties of Materials – UFOP, Ouro Preto, MG (Brazil)

    2014-08-01

    The control of the photo degradation of a fluorene–vinylene–phenylene based polymer, poly(9,9-di-hexylfluorenediylvinylene-alt-1,4-phenylenevinylene) (LaPPS16), was achieved by addition of a radical scavenger (RS) (enhancing photo resistance) or a radical initiator (RI) (reducing photo resistance). Photoluminescence, UV–Vis absorption and ¹H NMR spectroscopies and gel permeation chromatography (GPC) revealed that incorporating small amounts of RS or RI is an efficient way to control the rates of the photo-oxidation reactions, and thus to obtain the conjugated polymer with foreseeable degradation rates for applications in blue-light sensitive detectors for neonatal phototherapy. - Highlights: • Photo degradation control of a fluorene–vinylene–phenylene based polymer was achieved. • A radical scavenger enhanced photo resistance and a radical initiator decreased it. • Color change rate with irradiation dose provided a basis for dosimeter construction.

  16. Statistical evaluation of failures and repairs of the V-1 measuring and control system

    International Nuclear Information System (INIS)

    Laurinec, R.; Korec, J.; Mitosinka, J.; Zarnovican, V.

    1984-01-01

    A failure record card system was introduced for evaluating the reliability of the measurement and control equipment of the V-1 nuclear power plant. The SPU-800 microcomputer system is used for recording data on magnetic tape and their transmission to the central data processing department. The data are used for evaluating the reliability of components and circuits and a selection is made of the most failure-prone components, and the causes of failures are evaluated as are failure identification, repair and causes of outages. The system provides monthly, annual and total assessment data since the system was commissioned. The results of the statistical evaluation of failures are used for planning preventive maintenance and for determining optimal repair intervals. (E.S.)

  17. QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Yonatan Mengesha Awaj

    2013-03-01

    Full Text Available In order to survive in a competitive market, improving the quality and productivity of products or processes is a must for any company. This study applies statistical process control (SPC) tools in the production processing line and on the final product in order to reduce defects, by identifying where the highest waste occurs and by giving suggestions for improvement. The approach used in this study comprises direct observation, thorough examination of the production process lines, brainstorming sessions and fishbone diagrams; information was collected from potential customers and the company's workers through interviews and questionnaires, and Pareto charts/analyses and a control chart (p-chart) were constructed. It has been found that the company has many problems; specifically, there is high rejection or waste in the production processing line. The highest waste occurs in the melting process line, which causes loss due to trickle, and in the forming process line, which causes loss due to defective product rejection. The vital few problems were identified: blisters, double seam, stone, pressure failure and overweight. The principal aim of the study is to create awareness in the quality team of how to use SPC tools in problem analysis, especially to train the quality team on how to hold an effective brainstorming session, and to exploit these data in cause-and-effect diagram construction, Pareto analysis and control chart construction. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company has many constraints on implementing all suggestions for improvement within a short period of time, the company recognized that the suggestions will provide significant productivity improvement in the long run.

  18. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  19. Statistical Design of an Adaptive Synthetic X̄ Control Chart with Run Rule on Service and Management Operation

    Directory of Open Access Journals (Sweden)

    Shucheng Yu

    2016-01-01

    Full Text Available An improved synthetic X̄ control chart based on a hybrid adaptive scheme and a run rule scheme is introduced to enhance the statistical performance of the traditional synthetic X̄ control chart on service and management operation. The proposed hybrid adaptive schemes consider both variable sampling interval and variable sample size schemes. The properties of the proposed chart are obtained using a Markov chain approach. An extensive set of numerical results is presented to test the effectiveness of the proposed model in detecting small and moderate shifts in the process mean. The results show that the proposed chart is quicker than the standard synthetic X̄ chart and the CUSUM chart in detecting small and moderate shifts in the process of service and management operation.
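For the non-adaptive baseline chart, the zero-state ARL can be sketched directly (Wald's identity on geometric conforming run lengths); the subgroup size, limit width, and L below are illustrative defaults, and the adaptive and run-rule extensions of the paper are not modeled.

```python
import numpy as np
from scipy.stats import norm

def synthetic_xbar_arl(shift, n=5, k=3.0, L=10):
    """ARL of a basic (non-adaptive) synthetic X-bar chart.

    A subgroup mean is nonconforming if it falls outside +/- k*sigma/sqrt(n).
    The chart signals at the first conforming run length CRL <= L; with CRL
    geometric(p), Wald's identity gives ARL = 1 / (p * (1 - (1-p)**L)).
    """
    z = shift * np.sqrt(n)                   # mean shift in sigma_xbar units
    p = norm.sf(k - z) + norm.cdf(-k - z)    # P(nonconforming subgroup)
    return 1.0 / (p * (1.0 - (1.0 - p) ** L))

for d in (0.0, 0.25, 0.5, 1.0):
    print(f"shift {d:.2f} sigma: ARL = {synthetic_xbar_arl(d):9.1f}")
```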

  20. Testing for Statistical Discrimination based on Gender

    OpenAIRE

    Lesner, Rune Vammen

    2016-01-01

    This paper develops a model which incorporates the two most commonly cited strands of the literature on statistical discrimination, namely screening discrimination and stereotyping. The model is used to provide empirical evidence of statistical discrimination based on gender in the labour market. It is shown that the implications of both screening discrimination and stereotyping are consistent with observable wage dynamics. In addition, it is found that the gender wage gap decreases in tenure...

  1. Statistics of spatially integrated speckle intensity difference

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Yura, Harold

    2009-01-01

    We consider the statistics of the spatially integrated speckle intensity difference obtained from two separated finite collecting apertures. For fully developed speckle, closed-form analytic solutions for both the probability density function and the cumulative distribution function are derived here for both arbitrary values of the mean number of speckles contained within an aperture and the degree of coherence of the optical field. Additionally, closed-form expressions are obtained for the corresponding nth statistical moments.

  2. Controlling cyclic combustion timing variations using a symbol-statistics predictive approach in an HCCI engine

    International Nuclear Information System (INIS)

    Ghazimirsaied, Ahmad; Koch, Charles Robert

    2012-01-01

    Highlights: ► Misfire reduction in a combustion engine based on chaotic theory methods. ► Chaotic theory analysis of cyclic variation of a HCCI engine near misfire. ► Symbol sequence approach is used to predict ignition timing one cycle ahead. ► Prediction is combined with feedback control to lower HCCI combustion variation. ► Feedback control extends the HCCI operating range into the misfire region. -- Abstract: Cyclic variation of a Homogeneous Charge Compression Ignition (HCCI) engine near misfire is analyzed using chaotic theory methods, and feedback control is used to stabilize high cyclic variations. Variation of consecutive cycles of θPmax (the crank angle of maximum cylinder pressure over an engine cycle) for a Primary Reference Fuel engine is analyzed near misfire operation for five test points with similar conditions but different octane numbers. The return map of the time series of θPmax at each combustion cycle reveals the deterministic and random portions of the dynamics near misfire for this HCCI engine. A symbol-statistics approach is used to predict θPmax one cycle ahead. Predicted θPmax has similar dynamical behavior to the experimental measurements. Based on this cycle-ahead prediction, and using fuel octane as the input, feedback control is used to stabilize the instability of θPmax variations at this engine condition near misfire.
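A toy version of the symbol-sequence predictor, assuming an equiprobable partition and a word length of three; the alternating series stands in for θPmax cycles near misfire, and all tuning choices are illustrative rather than the paper's.

```python
import numpy as np
from collections import defaultdict

def symbol_predict(series, n_symbols=4, order=3):
    """One-cycle-ahead prediction by symbol-sequence statistics.

    The series is discretized into equiprobable symbols; for every observed
    length-`order` symbol word we store the values that followed it, and
    predict with their mean for the most recent word.
    """
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    sym = np.digitize(series, edges)
    table = defaultdict(list)
    for i in range(order, len(series)):
        table[tuple(sym[i - order:i])].append(series[i])
    hist = table.get(tuple(sym[-order:]))
    return float(np.mean(hist)) if hist else float(np.mean(series))

# Toy theta_Pmax-like series: noisy period-2 alternation near misfire.
rng = np.random.default_rng(4)
theta = 5 + 3 * (np.arange(300) % 2) + rng.normal(0, 0.4, 300)
print(symbol_predict(theta))   # close to the low phase (~5) due next cycle
```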

  3. Statistical estimate of mercury removal efficiencies for air pollution control devices of municipal solid waste incinerators.

    Science.gov (United States)

    Takahashi, Fumitake; Kida, Akiko; Shimaoka, Takayuki

    2010-10-15

    Although representative removal efficiencies of gaseous mercury for air pollution control devices (APCDs) are important to prepare more reliable atmospheric emission inventories of mercury, they have been still uncertain because they depend sensitively on many factors like the type of APCDs, gas temperature, and mercury speciation. In this study, representative removal efficiencies of gaseous mercury for several types of APCDs of municipal solid waste incineration (MSWI) were offered using a statistical method. 534 data of mercury removal efficiencies for APCDs used in MSWI were collected. APCDs were categorized as fixed-bed absorber (FA), wet scrubber (WS), electrostatic precipitator (ESP), and fabric filter (FF), and their hybrid systems. Data series of all APCD types had Gaussian log-normality. The average removal efficiency with a 95% confidence interval for each APCD was estimated. The FA, WS, and FF with carbon and/or dry sorbent injection systems had 75% to 82% average removal efficiencies. On the other hand, the ESP with/without dry sorbent injection had lower removal efficiencies of up to 22%. The type of dry sorbent injection in the FF system, dry or semi-dry, did not make more than 1% difference to the removal efficiency. The injection of activated carbon and carbon-containing fly ash in the FF system made less than 3% difference. Estimation errors of removal efficiency were especially high for the ESP. The national average removal efficiency of APCDs in Japanese MSWI plants was estimated on the basis of incineration capacity. Owing to the replacement of old APCDs for dioxin control, the national average removal efficiency increased from 34.5% in 1991 to 92.5% in 2003. This resulted in an additional reduction of about 0.86 Mg of emissions in 2003. Further application of the methodology of this study to other important emission sources like coal-fired power plants will contribute to better emission inventories. Copyright © 2010 Elsevier B.V. All rights reserved.
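A sketch of the statistical estimate itself, assuming log-normal efficiencies as the paper reports: the mean and its 95% confidence interval are formed on the log scale and back-transformed. The efficiency values below are invented.

```python
import numpy as np
from scipy import stats

def lognormal_mean_ci(x, conf=0.95):
    """Geometric-mean removal efficiency with a confidence interval,
    assuming log-normally distributed data."""
    logx = np.log(x)
    m, se = logx.mean(), stats.sem(logx)
    lo, hi = stats.t.interval(conf, len(x) - 1, loc=m, scale=se)
    return np.exp([lo, m, hi])

# Invented removal efficiencies (%) for one APCD category.
eff = np.array([72, 85, 78, 90, 64, 81, 76, 88, 70, 83], float)
print(np.round(lognormal_mean_ci(eff), 1))   # [lower, geo-mean, upper]
```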

  4. Statistical aspects of fish stock assessment

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte

    statistical aspects of fish stock assessment, which include topics such as time series analysis, generalized additive models (GAMs), and non-linear state-space/mixed models capable of handling missing data and a high number of latent states and parameters. The aim is to improve the existing methods for stock assessment by application of state-of-the-art statistical methodology. The main contributions are presented in the form of six research papers. The major part of the thesis deals with age-structured assessment models, which is the most common approach. Conversion from length to age distributions…

  5. Bayesian approach to inverse statistical mechanics

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  6. Integer Set Compression and Statistical Modeling

    DEFF Research Database (Denmark)

    Larsson, N. Jesper

    2014-01-01

    Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from statistics, explore the effects of permuting the enumeration order based on element probabilities…
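As a baseline for what set compression can achieve, the sketch below ranks a subset in the combinatorial number system, reaching the log2 C(n, k)-bit optimum for uniformly distributed sets. This illustrates the problem setting only; it is not the paper's recursive subset-size method.

```python
from math import comb

def subset_rank(subset):
    """Rank of a sorted k-subset of {0, 1, ...} in the combinatorial
    number system: an integer in [0, C(n, k)), i.e. the log2 C(n, k)-bit
    optimum for uniformly distributed k-subsets of an n-set."""
    return sum(comb(e, i + 1) for i, e in enumerate(sorted(subset)))

def subset_unrank(k, rank):
    """Inverse of subset_rank for a given subset size k."""
    out = []
    for i in range(k, 0, -1):
        e = i - 1
        while comb(e + 1, i) <= rank:   # largest e with comb(e, i) <= rank
            e += 1
        rank -= comb(e, i)
        out.append(e)
    return sorted(out)

s = [2, 5, 6, 9]
r = subset_rank(s)
print(r, subset_unrank(len(s), r) == s)   # 158 True (round-trips)
```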

  7. Distinguish Dynamic Basic Blocks by Structural Statistical Testing

    DEFF Research Database (Denmark)

    Petit, Matthieu; Gotlieb, Arnaud

    Statistical testing aims at generating random test data that respect selected probabilistic properties. A probability distribution is associated with the program input space in order to achieve the statistical test purpose: to test the most frequent usage of the software, or to maximize the probability of covering some structural element (e.g., a control flow path) during the test data selection. We implemented this algorithm in a statistical test data generator for Java programs. A first experimental validation is presented.

  8. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  9. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have achieved several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  10. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students.

    Science.gov (United States)

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah

    2015-01-01

    Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system is 'value added' because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students' problem-solving performance at the end of each phase. In addition, the combination of the differences in students' test scores was considered to be statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirmed the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students.

  11. Development and Assessment of a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; VanderStoep, Jill; Holmes, Vicki-Lynn; Quisenberry, Brooke; Swanson, Todd

    2011-01-01

    The algebra-based introductory statistics course is the most popular undergraduate course in statistics. While there is a general consensus for the content of the curriculum, the recent Guidelines for Assessment and Instruction in Statistics Education (GAISE) have challenged the pedagogy of this course. Additionally, some arguments have been made…

  12. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more … statistical agencies and institutions to provide details of statistical activities … …ing several training programmes … …ful completion of Indian Statistical Service examinations …

  13. Control device for a nuclear reactor with a multitude of control rods, extending into the reactor core from above, with linear drive mechanisms and additional gripper devices

    International Nuclear Information System (INIS)

    Bevilacqua, F.

    1979-01-01

    The components of the additional gripper devices, with magnetically operated finger-shaped latches, are separated from the likewise magnetically operated latches of the linear drive mechanisms in order to avoid common-mode failures when fast shutdown is required. Only part of the safety rods are held in the withdrawn position by the additional gripping devices. Recording elements are provided that indicate positively which of the safety latches is engaged with the control rods. At the upper end of each control rod there is a coupling head held in the withdrawn position by electromagnetically operated locking devices, as long as control power is available. (DG) [de

  14. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  15. Statistics for X-chromosome associations.

    Science.gov (United States)

    Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor

    2018-06-13

    In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because of the difference between males and females in number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal, and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur even under the null hypothesis when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations, such as trait variance differences between the sexes or small sex differences in allele frequencies, from assumptions. © 2018 WILEY PERIODICALS, INC.
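One commonly proposed X-chromosome coding (males hemizygous, coded 0/2, with sex as a covariate) can be exercised on simulated data as below; the effect sizes, allele frequency, and the use of statsmodels are assumptions of the sketch, not the simulation design of the paper.

```python
import numpy as np
import statsmodels.api as sm

# Simulate an X-linked marker: females carry 0/1/2 copies, males 0/2
# (so a male allele counts like a homozygous female), sex as covariate.
rng = np.random.default_rng(5)
n, maf = 2000, 0.3
sex = rng.integers(0, 2, n)                  # 1 = male
g_f = rng.binomial(2, maf, n)                # female genotypes: 0/1/2
g_m = 2 * rng.binomial(1, maf, n)            # male genotypes: 0/2
geno = np.where(sex == 1, g_m, g_f)

logit = -1.0 + 0.25 * geno + 0.2 * sex       # true model (illustrative)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([geno, sex]))
fit = sm.Logit(y, X).fit(disp=0)
print(fit.params)    # genotype and sex effects recovered
print(fit.pvalues)
```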

  16. Conversion factors and oil statistics

    International Nuclear Information System (INIS)

    Karbuz, Sohbet

    2004-01-01

    World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention to statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues, mass to volume equivalencies (barrels to tonnes) and for broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
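
    A hedged numerical illustration of the point (the factors below are typical published values, not figures taken from the paper): converting the same mass with slightly different barrels-per-tonne factors shifts an annual balance by tens of millions of barrels.

      # Toy illustration: converting 100 million tonnes of crude to barrels
      # with two plausible conversion factors (values vary with crude density).
      tonnes = 100e6
      light_crude = 7.6     # barrels per tonne, lighter crude (assumed value)
      average_crude = 7.33  # barrels per tonne, a common default (assumed value)

      print(f"{tonnes * light_crude / 1e6:.0f} million barrels")    # ~760
      print(f"{tonnes * average_crude / 1e6:.0f} million barrels")  # ~733
      # A 0.27 bbl/t difference in the factor moves the total by ~27 million
      # barrels -- far from negligible in a world oil balance.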

  17. Statistical Learning and Dyslexia: A Systematic Review

    Science.gov (United States)

    Schmalz, Xenia; Altoè, Gianmarco; Mulatti, Claudio

    2017-01-01

    The existing literature on developmental dyslexia (hereafter: dyslexia) often focuses on isolating cognitive skills which differ across dyslexic and control participants. Among potential correlates, previous research has studied group differences between dyslexic and control participants in performance on statistical learning tasks. A statistical…

  18. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances of computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver. Note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper is aimed at understanding the statistical implications of Monte Carlo depletions, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to its statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) to increase the number of individual Monte Carlo histories; 2) to increase the number of time steps; 3) to run additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors including both the local statistical error and the propagated statistical error. (authors)
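
    The batch idea can be sketched in a few lines (a schematic only, with a stand-in random-walk model in place of a real transport/burnup solver): run several independent depletions with different random seeds and take the spread of the final number densities across batches as the overall statistical error.

      import numpy as np

      def mock_depletion(seed, steps=20):
          """Stand-in for one independent Monte Carlo depletion run: each
          burnup step perturbs a nuclide number density with MC noise."""
          rng = np.random.default_rng(seed)
          density = 1.0
          for _ in range(steps):
              density *= 1.0 - 0.01 + rng.normal(0.0, 0.002)  # depletion + noise
          return density

      batches = np.array([mock_depletion(seed) for seed in range(30)])
      mean = batches.mean()
      sigma = batches.std(ddof=1)       # total error (local + propagated)
      sigma_mean = sigma / np.sqrt(len(batches))
      print(f"N = {mean:.4f} +/- {sigma_mean:.4f} (batch std {sigma:.4f})")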

  19. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method: An Example with Alendronate

    OpenAIRE

    van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

    2016-01-01

    Background For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. Objective The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in relation to fracture risk with alendronate versus placebo in postmenopausal women. Methods We performed a post-hoc analysis of the Fracture Intervention Trial (FIT), a randomized, con...

  20. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics

  1. Reading Statistics and Research

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help the readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics Using SPSS (second edition published by Sage in 2005).

  2. A statistical rationale for establishing process quality control limits using fixed sample size, for critical current verification of SSC superconducting wire

    International Nuclear Information System (INIS)

    Pollock, D.A.; Brown, G.; Capone, D.W. II; Christopherson, D.; Seuntjens, J.M.; Woltz, J.

    1992-03-01

    The purpose of this paper is to demonstrate a statistical method for verifying superconducting wire process stability as represented by the critical current I{sub c}. The paper does not propose changing the I{sub c} testing frequency for wire during Phase 1 of the present Vendor Qualification Program. The actual statistical limits demonstrated for one supplier's data are not expected to be suitable for all suppliers. However, the method used to develop the limits, and the potential for improved process control through their use, may be applied equally. Implementing the demonstrated method implies that the current practice of testing all pieces of wire from each billet, for the purpose of detecting manufacturing process errors (i.e. missing a heat-treatment cycle for a part of the billet, etc.), can be replaced by other less costly process control measures. As used in this paper, process control limits for critical current are quantitative indicators of the uniformity of the source manufacturing process. The limits serve as alarms indicating the need for manufacturing process investigation
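
    The kind of limit computation described can be sketched directly (hypothetical I{sub c} data, not the supplier data from the paper): estimate the process mean and standard deviation from a fixed qualification sample and set 3-sigma alarm limits for subsequent billets.

      import numpy as np

      rng = np.random.default_rng(1)
      ic_sample = rng.normal(380.0, 6.0, size=50)  # hypothetical I_c values (A)

      mean = ic_sample.mean()
      s = ic_sample.std(ddof=1)
      ucl, lcl = mean + 3 * s, mean - 3 * s        # individual-value 3-sigma limits
      print(f"I_c limits: LCL = {lcl:.1f} A, UCL = {ucl:.1f} A")

      new_billet = rng.normal(380.0, 6.0, size=5)  # later production pieces
      for ic in new_billet:
          flag = "ALARM -- investigate process" if not lcl <= ic <= ucl else "ok"
          print(f"{ic:6.1f} A  {flag}")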

  3. Modern Thermodynamics with Statistical Mechanics

    CERN Document Server

    Helrich, Carl S

    2009-01-01

    With the aim of presenting thermodynamics in as simple and as unified a form as possible, this textbook starts with an introduction to the first and second laws and then promptly addresses the complete set of the potentials in a subsequent chapter and as a central theme throughout. Before discussing modern laboratory measurements, the book shows that the fundamental quantities sought in the laboratory are those which are required for determining the potentials. Since the subjects of thermodynamics and statistical mechanics are a seamless whole, statistical mechanics is treated as an integral part of the text. Other key topics such as irreversibility, the ideas of Ilya Prigogine, chemical reaction rates, equilibrium of heterogeneous systems, and transition-state theory serve to round out this modern treatment. An additional chapter covers quantum statistical mechanics due to active current research in Bose-Einstein condensation. End-of-chapter exercises, chapter summaries, and an appendix reviewing fundamental pr...

  4. Analysis and Design of a Maglev Permanent Magnet Synchronous Linear Motor to Reduce Additional Torque in dq Current Control

    Directory of Open Access Journals (Sweden)

    Feng Xing

    2018-03-01

    Full Text Available The maglev linear motor has three degrees of motion freedom, which are respectively realized by the thrust force in the x-axis, the levitation force in the z-axis and the torque around the y-axis. Both the thrust force and levitation force can be seen as the sum of the forces on the three windings. The resultant thrust force and resultant levitation force are independently controlled by the d-axis current and q-axis current respectively. Thus, the commonly used dq transformation control strategy is suitable for realizing the control of the resultant forces, both thrust and levitation. However, the forces on the three windings also generate an additional torque because they do not pass through the mover mass center. To realize high-precision control of the maglev system, a maglev linear motor with a new structure is proposed in this paper to decrease this torque. First, the electromagnetic model of the motor is deduced through the Lorenz force formula. Second, analytic and finite element methods are used to explore the origin of this additional torque and the factors that affect its trend. Furthermore, a maglev linear motor with a new structure is proposed, with two sets of windings, shifted by 90 degrees, designed on the mover. Under such a structure, the mover-position-dependent periodic part of the additional torque can be offset. Finally, the theoretical analysis is validated by simulation results showing that the additionally generated rotating torque can be offset, with little fluctuation, in the proposed new-structure maglev linear motor. Moreover, the control system is built in MATLAB/Simulink, which shows that it has small thrust ripple and high-precision performance.
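
    For reference, the dq control strategy mentioned here rests on the standard Park transformation (the textbook amplitude-invariant form, not a formula specific to this paper), which maps the three winding currents onto the d- and q-axis currents:

      \[
      \begin{bmatrix} i_d \\ i_q \end{bmatrix}
      = \frac{2}{3}
      \begin{bmatrix}
      \cos\theta & \cos\!\left(\theta - \tfrac{2\pi}{3}\right) & \cos\!\left(\theta + \tfrac{2\pi}{3}\right) \\
      -\sin\theta & -\sin\!\left(\theta - \tfrac{2\pi}{3}\right) & -\sin\!\left(\theta + \tfrac{2\pi}{3}\right)
      \end{bmatrix}
      \begin{bmatrix} i_a \\ i_b \\ i_c \end{bmatrix}
      \]

    Per the abstract, the resultant thrust is then regulated through the d-axis current and the levitation force through the q-axis current.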

  5. Robust Tests for Additive Gene-Environment Interaction in Case-Control Studies Using Gene-Environment Independence

    DEFF Research Database (Denmark)

    Liu, Gang; Lee, Seunggeun; Lee, Alice W

    2018-01-01

    There have been recent proposals advocating the use of additive gene-environment interaction instead of the widely used multiplicative scale, as a more relevant public health measure. Using gene-environment independence enhances the power for testing multiplicative interaction in case-control studies. However, under departure from this assumption, substantial bias in the estimates and inflated Type I error in the corresponding tests can occur. This paper extends the empirical Bayes (EB) approach previously developed for multiplicative interaction that trades off between bias and efficiency ... test with case-control data. Our simulation studies suggest that the EB approach uses the gene-environment independence assumption in a data-adaptive way and provides power gain compared to the standard logistic regression analysis and better control of Type I error when compared to the analysis...

  6. Limit temperature for entanglement in generalized statistics

    International Nuclear Information System (INIS)

    Rossignoli, R.; Canosa, N.

    2004-01-01

    We discuss the main properties of general thermal states derived from non-additive entropic forms and their use for studying quantum entanglement. It is shown that all these states become more mixed as the temperature increases, approaching the full random state for T→∞. The formalism is then applied to examine the limit temperature for entanglement in a two-qubit XXZ Heisenberg chain, which exhibits the peculiar feature of being independent of the applied magnetic field in the conventional von Neumann based statistics. In contrast, this temperature is shown to be field dependent in a generalized statistics, even for small deviations from the standard form. Results for the Tsallis-based statistics are examined in detail
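
    For orientation, the best-known non-additive entropic form of the kind discussed here is the Tsallis entropy (standard definition, quoted for context rather than taken from the paper):

      \[
      S_q = k \,\frac{1 - \mathrm{Tr}\,\rho^{\,q}}{q - 1},
      \qquad
      \lim_{q \to 1} S_q = -k \,\mathrm{Tr}\,\rho \ln \rho ,
      \]

    which recovers the von Neumann entropy in the limit q → 1, so "small deviations from the standard form" correspond to values of q close to 1.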

  7. Reading Statistics And Research

    OpenAIRE

    Akbulut, Reviewed By Yavuz

    2008-01-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and ku...

  8. Introduction to statistics using interactive MM*Stat elements

    CERN Document Server

    Härdle, Wolfgang Karl; Rönz, Bernd

    2015-01-01

    MM*Stat, together with its enhanced online version with interactive examples, offers a flexible tool that facilitates the teaching of basic statistics. It covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). MM*Stat is also designed to help students rework class material independently and to promote comprehension with the help of additional examples. Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students’ knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical...

  9. Audit sampling: A qualitative study on the role of statistical and non-statistical sampling approaches on audit practices in Sweden

    OpenAIRE

    Ayam, Rufus Tekoh

    2011-01-01

    PURPOSE: The two approaches to audit sampling, statistical and non-statistical, have been examined in this study. The overall purpose of the study is to explore the current extent to which statistical and non-statistical sampling approaches are utilized by independent auditors during auditing practices. Moreover, the study also seeks to achieve two additional purposes; the first is to find out whether auditors utilize different sampling techniques when auditing SMEs (Small and Medium-Sized Ente...

  10. The Scythe Statistical Library: An Open Source C++ Library for Statistical Computation

    Directory of Open Access Journals (Sweden)

    Daniel Pemstein

    2011-08-01

    Full Text Available The Scythe Statistical Library is an open source C++ library for statistical computation. It includes a suite of matrix manipulation functions, a suite of pseudo-random number generators, and a suite of numerical optimization routines. Programs written using Scythe are generally much faster than those written in commonly used interpreted languages, such as R and MATLAB, and can be compiled on any system with the GNU GCC compiler (and perhaps with other C++ compilers). One of the primary design goals of the Scythe developers has been ease of use for non-expert C++ programmers. Ease of use is provided through three primary mechanisms: (1) operator and function over-loading, (2) numerous pre-fabricated utility functions, and (3) clear documentation and example programs. Additionally, Scythe is quite flexible and entirely extensible because the source code is available to all users under the GNU General Public License.

  11. Controllable Microdroplet Splitting via Additional Lateral Flow and its Application in Rapid Synthesis of Multi-scale Microspheres

    KAUST Repository

    Zhou, Bingpu

    2015-01-01

    In this paper, we demonstrate that controllable microdroplet splitting can be achieved via an additional lateral flow with simplicity and high controllability. The volume ratio of the two splitting products can be flexibly regulated by adjusting the flow rate ratio between the main and additional lateral flows. The splitting phenomena under different main flow rates were investigated. A volume ratio up to 200 : 1 of the two daughter droplets under a relatively higher main flow rate was experimentally achieved based on our approach. In this case, we have successfully achieved uniform daughter droplets with a smallest diameter of ∼19.5 ± 1.6 μm. With such a design, we have synthesized uniform PEGDA hydrogel microspheres with diameters ranging from ∼30 μm to over a hundred micrometers simultaneously.

  12. Equilibrium statistical mechanics for self-gravitating systems: local ergodicity and extended Boltzmann-Gibbs/White-Narayan statistics

    Science.gov (United States)

    He, Ping

    2012-01-01

    The long-standing puzzle surrounding the statistical mechanics of self-gravitating systems has not yet been solved successfully. We formulate a systematic theoretical framework of entropy-based statistical mechanics for spherically symmetric collisionless self-gravitating systems. We use an approach that is very different from that of the conventional statistical mechanics of short-range interaction systems. We demonstrate that the equilibrium states of self-gravitating systems consist of both mechanical and statistical equilibria, with the former characterized by a series of velocity-moment equations and the latter by statistical equilibrium equations, which should be derived from the entropy principle. The velocity-moment equations of all orders are derived from the steady-state collisionless Boltzmann equation. We point out that the ergodicity is invalid for the whole self-gravitating system, but it can be re-established locally. Based on the local ergodicity, using Fermi-Dirac-like statistics, with the non-degenerate condition and the spatial independence of the local microstates, we rederive the Boltzmann-Gibbs entropy. This is consistent with the validity of the collisionless Boltzmann equation, and should be the correct entropy form for collisionless self-gravitating systems. Apart from the usual constraints of mass and energy conservation, we demonstrate that the series of moment or virialization equations must be included as additional constraints on the entropy functional when performing the variational calculus; this is an extension to the original prescription by White & Narayan. Any possible velocity distribution can be produced by the statistical-mechanical approach that we have developed with the extended Boltzmann-Gibbs/White-Narayan statistics. Finally, we discuss the questions of negative specific heat and ensemble inequivalence for self-gravitating systems.

  13. Use of cation selective membrane and acid addition for pH control in two-dimensional electrokinetic remediation of copper

    Energy Technology Data Exchange (ETDEWEB)

    Chan, M.S.M.; Lynch, R.J. [Cambridge Univ., Engineering Dept. (United Kingdom); Ilett, D.J. [AEA Technology, Harwell, Oxfordshire (United Kingdom)

    2001-07-01

    The feasibility of using a combination of a cation selective membrane and acid addition for pH control in the electrokinetic remediation of toxic and heavy metals in low-permeability soil has been investigated. The high pH generated during the remediation process, as a result of surplus OH{sup -} ions, may cause metal ions to precipitate as hydroxides at or near the cathodes. This region of high pH is known to be associated with high electrical resistance, which limits the remediation efficiency by inhibiting current flow through the soil. One way to control pH is by adding acid to neutralize the OH{sup -} ions. However, preliminary work showed that addition of acid to the cathodic region was not effective in preventing the spread of the alkaline zone from cathodes toward anodes. Precipitates were formed before metal ions reached the cathodic region. Therefore, another method of pH control was investigated, using a cation selective membrane to enhance the electrokinetic process. The membrane was placed in front of the cathodes to contain the OH{sup -} ions generated, and confine the precipitates of metal hydroxide to a small cathodic region. The clean-up of a contaminated site was modelled in a rectangular tank, using silt as the low-permeability soil and copper to simulate the contamination. The objective was to redistribute the contaminant so as to concentrate it into a small area. Three experiments were performed with the following methods of pH control: (1) acid addition, (2) use of a cation selective membrane and (3) a combination of acid addition and a cation selective membrane. Using the combined approach, it was found that 75% of the target clean-up section (bounded by the cation selective membrane and the anodes) had more than 40% of the initial copper removed. The general efficiency of remediation increased in the following order. (orig.)

  14. Instructional Theory for Teaching Statistics.

    Science.gov (United States)

    Atwood, Jan R.; Dinham, Sarah M.

    Metatheoretical analysis of Ausubel's Theory of Meaningful Verbal Learning and Gagne's Theory of Instruction using the Dickoff and James paradigm produced two instructional systems for basic statistics. The systems were tested with a pretest-posttest control group design utilizing students enrolled in an introductory-level graduate statistics…

  15. Teaching biology through statistics: application of statistical methods in genetics and zoology courses.

    Science.gov (United States)

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We assessed the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.

  16. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters: the basic conception and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuation; statistical dynamics with independent particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rate in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and the nonideal lattice model; imperfect gas theory on liquids; theory on solutions; statistical thermodynamics of interfaces; statistical thermodynamics of high-molecule systems; and quantum statistics

  17. On Nonextensive Statistics, Chaos and Fractal Strings

    CERN Document Server

    Castro, C

    2004-01-01

    Motivated by the growing evidence of universality and chaos in QFT and string theory, we study the Tsallis non-extensive statistics (with a non-additive $q$-entropy) of an ensemble of fractal strings and branes of different dimensionalities. Non-equilibrium systems with complex dynamics in stationary states may exhibit large fluctuations of intensive quantities which are described in terms of generalized statistics. Tsallis statistics is a particular representative of such class. The non-extensive entropy and probability distribution of a canonical ensemble of fractal strings and branes is studied in terms of their dimensional spectrum which leads to a natural upper cutoff in energy and establishes a direct correlation among dimensions, energy and temperature. The absolute zero temperature (Kelvin) corresponds to zero dimensions (energy) and an infinite temperature corresponds to infinite dimensions. In the concluding remarks some applications of fractal statistics, quasi-particles, knot theory, quantum...

  18. Shape-control by microwave-assisted hydrothermal method for the synthesis of magnetite nanoparticles using organic additives

    Energy Technology Data Exchange (ETDEWEB)

    Rizzuti, Antonino [Politecnico di Bari, Dipartimento di Ingegneria Civile, Ambientale, del Territorio, Edile e di Chimica (Italy); Dassisti, Michele [Politecnico di Bari, Dipartimento di Meccanica, Management e Matematica (Italy); Mastrorilli, Piero, E-mail: p.mastrorilli@poliba.it [Politecnico di Bari, Dipartimento di Ingegneria Civile, Ambientale, del Territorio, Edile e di Chimica (Italy); Sportelli, Maria C.; Cioffi, Nicola; Picca, Rosaria A. [Università di Bari, Dipartimento di Chimica (Italy); Agostinelli, Elisabetta; Varvaro, Gaspare [Consiglio Nazionale delle Ricerche, Istituto di Struttura della Materia (Italy); Caliandro, Rocco [Consiglio Nazionale delle Ricerche, Istituto di Cristallografia (Italy)

    2015-10-15

    A simple and fast microwave-assisted hydrothermal method is proposed for the synthesis of magnetite nanoparticles. The addition of different surfactants (polyvinylpyrrolidone, oleic acid, or trisodium citrate) was studied to investigate the effect on size distribution, morphology, and functionalization of the magnetite nanoparticles. Microwave irradiation at 150 °C for 2 h of aqueous ferrous chloride and hydrazine without additives resulted in hexagonal magnetite nanoplatelets with a facet-to-facet distance of 116 nm and a thickness of 40 nm having a saturation magnetization of ∼65 Am{sup 2} kg{sup −1}. The use of polyvinylpyrrolidone led to hexagonal nanoparticles with a facet-to-facet distance of 120 nm and a thickness of 53 nm with a saturation magnetization of ∼54 Am{sup 2} kg{sup −1}. Additives such as oleic acid and trisodium citrate yielded quasi-spherical nanoparticles of 25 nm in size with a saturation magnetization of ∼70 Am{sup 2} kg{sup −1} and spheroidal nanoparticles of 60 nm in size with a saturation magnetization up to ∼82 Am{sup 2} kg{sup −1}, respectively. A kinetic control of the crystal growth is believed to be responsible for the hexagonal habit of the nanoparticles obtained without additive. Conversely, a thermodynamic control of the crystal growth, leading to spheroidal nanoparticles, seems to occur when additives which strongly interact with the nanoparticle surface are used. A thorough characterization of the materials was performed. Magnetic properties were investigated by Superconducting Quantum Interference Device and Vibrating Sample magnetometers. Based on the observed magnetic properties, the magnetite obtained using citrate appears to be a promising support for magnetically transportable catalysts.

  19. Probabilistic fuzzy systems as additive fuzzy systems

    NARCIS (Netherlands)

    Almeida, R.J.; Verbeek, N.; Kaymak, U.; Costa Sousa, da J.M.; Laurent, A.; Strauss, O.; Bouchon-Meunier, B.; Yager, R.

    2014-01-01

    Probabilistic fuzzy systems combine a linguistic description of the system behaviour with statistical properties of data. It was originally derived based on Zadeh’s concept of probability of a fuzzy event. Two possible and equivalent additive reasoning schemes were proposed, that lead to the

  20. Statistical thermodynamics understanding the properties of macroscopic systems

    CERN Document Server

    Fai, Lukong Cornelius

    2012-01-01

    Basic Principles of Statistical PhysicsMicroscopic and Macroscopic Description of StatesBasic PostulatesGibbs Ergodic AssumptionGibbsian EnsemblesExperimental Basis of Statistical MechanicsDefinition of Expectation ValuesErgodic Principle and Expectation ValuesProperties of Distribution FunctionRelative Fluctuation of an Additive Macroscopic ParameterLiouville TheoremGibbs Microcanonical EnsembleMicrocanonical Distribution in Quantum MechanicsDensity MatrixDensity Matrix in Energy RepresentationEntropyThermodynamic FunctionsTemperatureAdiabatic ProcessesPressureThermodynamic IdentityLaws of Th

  1. Optimal allocation of testing resources for statistical simulations

    Science.gov (United States)

    Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick

    2015-07-01

    Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data on the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The proposed methodology determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses the multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable on the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
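
    The resampling step described can be sketched as follows (a minimal illustration using the standard normal-inverse-Wishart posterior as a stand-in for the paper's multivariate t / Wishart machinery; all data are invented):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      data = rng.multivariate_normal([10.0, 5.0], [[4.0, 1.0], [1.0, 2.0]], size=15)
      n, p = data.shape
      xbar, S = data.mean(axis=0), np.cov(data, rowvar=False)

      # Draw plausible population parameters given the limited sample:
      # a covariance from an inverse-Wishart, then the mean given that covariance.
      def draw_population(rng):
          cov = stats.invwishart.rvs(df=n - 1, scale=(n - 1) * S, random_state=rng)
          mu = rng.multivariate_normal(xbar, cov / n)
          return mu, cov

      draws = [draw_population(rng) for _ in range(1000)]
      mu_spread = np.std([mu for mu, _ in draws], axis=0)
      print("uncertainty in the input means due to limited data:", mu_spread)
      # Variables with a large spread are candidates for additional experiments.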

  2. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods take a prominent place in psychologists' educational programs. Because these contents are known to be difficult to understand and hard to learn, students fear them. Those who do not aspire to a research career at a university will quickly forget the drilled contents. Furthermore, because at first glance it does not seem applicable to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education seems to make sense only as a way of commanding respect from other professions, namely physicians. For their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. We therefore analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based upon statistics. Being able to write and critically read original articles, as the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research, and ultimately to expose one's own professional work to arbitrariness.

  3. Statistical Methods for Unusual Count Data

    DEFF Research Database (Denmark)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads

    2016-01-01

    Microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model, with the rate of detection defined as a count of microchimerism genome equivalents per...
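
    The model comparison described can be sketched with statsmodels (simulated overdispersed counts with an exposure offset; the variable names and data are illustrative only, not from the study):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 200
      group = rng.integers(0, 2, n)                # e.g., case vs control
      cells = rng.integers(50_000, 200_000, n)     # genome equivalents assayed
      lam = np.exp(-9.5 + 0.6 * group) * cells     # rate per cell times exposure
      counts = rng.negative_binomial(1.5, 1.5 / (1.5 + lam))  # overdispersed

      X = sm.add_constant(group)
      offset = np.log(cells)
      pois = sm.GLM(counts, X, family=sm.families.Poisson(), offset=offset).fit()
      nb = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=1.0),
                  offset=offset).fit()
      print("Poisson AIC:", round(pois.aic, 1), " NB AIC:", round(nb.aic, 1))
      # With excess zeros and occasional large values, the negative binomial
      # fit typically wins on AIC.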

  4. Statistics of foreign trade in radioactive materials

    International Nuclear Information System (INIS)

    Anon.

    2001-01-01

    The German Federal Office for Industry and Foreign Trade Control (BAFA) keeps annual statistics of the imports and exports of radioactive materials, nuclear fuels included. The entries, some of them with precise details, cover the participating countries and the radionuclides concerned as well as all kinds of radioactive materials. The tables listed in the article represent the overall balance of the development of imports and exports of radioactive materials for the years 1983 to 2000 arranged by activity levels, including the development of nuclear fuel imports and exports. For the year 2000, an additional trade balance for irradiated and unirradiated nuclear fuels and source materials differentiated by enrichment is presented for the countries involved. In 2000, some 2446 t of nuclear fuels and source materials were imported into the Federal Republic, while approx. 2720 t were exported. The chief trading partners are countries of the European Union and Russia, South Korea, and Brazil. (orig.) [de

  5. Soft Time-Suboptimal Controlling Structure for Mechanical Systems

    DEFF Research Database (Denmark)

    Kulczycki, Piotr; Wisniewski, Rafal; Kowalski, Piotr

    2004-01-01

    The paper presents the concept of a soft control structure based on the time-optimal approach. Its parameters are selected in accordance with the rules of statistical decision theory, and in addition it makes it possible to eliminate rapid changes in control values. The object is a basic mechanical system, with uncertain (also non-stationary) mass treated as a stochastic process. The methodology proposed here is of a universal nature and may easily be applied to other uncertainty elements of time-optimal controlled mechanical systems.

  6. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students.

    Directory of Open Access Journals (Sweden)

    Farzaneh Saadati

    Full Text Available Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system is 'value added', because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students' problem-solving performance at the end of each phase. In addition, the differences in students' test scores remained statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirmed the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students.

  7. Automatic optimisation of beam orientations using the simplex algorithm and optimisation of quality control using statistical process control (S.P.C.) for intensity modulated radiation therapy (I.M.R.T.)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, K

    2008-11-15

    Intensity Modulated Radiation Therapy (I.M.R.T.) is currently considered as a technique of choice to increase the local control of the tumour while reducing the dose to surrounding organs at risk. However, its routine clinical implementation is partially held back by the excessive amount of work required to prepare the patient treatment. In order to increase the efficiency of the treatment preparation, two axes of work have been defined. The first axis concerned the automatic optimisation of beam orientations. We integrated the simplex algorithm in the treatment planning system. Starting from the dosimetric objectives set by the user, it can automatically determine the optimal beam orientations that best cover the target volume while sparing organs at risk. In addition to saving time, the simplex results for three patients with cancer of the oropharynx showed that the quality of the plan is also increased compared to a manual beam selection. Indeed, for an equivalent or even a better target coverage, it reduces the dose received by the organs at risk. The second axis of work concerned the optimisation of pre-treatment quality control. We used an industrial method, Statistical Process Control (S.P.C.), to retrospectively analyse the absolute dose quality control results performed using an ionisation chamber at Centre Alexis Vautrin (C.A.V.). This study showed that S.P.C. is an efficient method to reinforce treatment security using control charts. It also showed that our dose delivery process was stable and statistically capable for prostate treatments, which implies that a reduction of the number of controls can be considered for this type of treatment at the C.A.V. (author)
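
    As a toy illustration of the optimisation idea (a generic linear program solved with scipy, not the treatment planning system used in the thesis; all dose numbers are invented): choose non-negative beam weights that meet a minimum target dose while minimising the dose to an organ at risk.

      import numpy as np
      from scipy.optimize import linprog

      # Dose per unit weight delivered by 3 candidate beams to 2 target voxels
      # and 1 organ-at-risk voxel (numbers invented for illustration).
      target_dose = np.array([[1.0, 0.4, 0.6],
                              [0.5, 1.0, 0.7]])
      oar_dose = np.array([0.3, 0.6, 0.2])

      # Minimise OAR dose subject to each target voxel receiving >= 60 Gy
      # (>= constraints are encoded as -A x <= -b for linprog).
      res = linprog(c=oar_dose,
                    A_ub=-target_dose, b_ub=-np.full(2, 60.0),
                    bounds=[(0, None)] * 3)
      print("beam weights:", res.x.round(2), " OAR dose:", round(res.fun, 2))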

  8. Control of a Robotic Hand Using a Tongue Control System-A Prosthesis Application.

    Science.gov (United States)

    Johansen, Daniel; Cipriani, Christian; Popovic, Dejan B; Struijk, Lotte N S A

    2016-07-01

    The aim of this study was to investigate the feasibility of using an inductive tongue control system (ITCS) for controlling robotic/prosthetic hands and arms. This study presents a novel dual modal control scheme for multigrasp robotic hands combining standard electromyogram (EMG) with the ITCS. The performance of the ITCS control scheme was evaluated in a comparative study. Ten healthy subjects used both the ITCS control scheme and a conventional EMG control scheme to complete grasping exercises with the IH1 Azzurra robotic hand implementing five grasps. Time to activate a desired function or grasp was used as the performance metric. Statistically significant differences were found when comparing the performance of the two control schemes. On average, the ITCS control scheme was 1.15 s faster than the EMG control scheme, corresponding to a 35.4% reduction in the activation time. The largest difference was for grasp 5 with a mean AT reduction of 45.3% (2.38 s). The findings indicate that using the ITCS control scheme could allow for faster activation of specific grasps or functions compared with a conventional EMG control scheme. For transhumeral and especially bilateral amputees, the ITCS control scheme could have a significant impact on the prosthesis control. In addition, the ITCS would provide bilateral amputees with the additional advantage of environmental and computer control for which the ITCS was originally developed.

  9. Statistical inference for the additive hazards model under outcome-dependent sampling.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk of radon exposure.
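
    For context, the additive hazards model referred to here is usually written in the Lin-Ying form (standard notation, quoted for reference rather than reproduced from the paper):

      \[
      \lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z(t),
      \]

    so covariates shift the baseline hazard \(\lambda_0(t)\) additively, in contrast to the multiplicative Cox model \(\lambda(t \mid Z) = \lambda_0(t)\exp(\beta^{\top} Z)\).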

  10. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  11. Safeguards agreement and additional protocol - IAEA instruments for control of nuclear materials distribution and their application in Tajikistan

    International Nuclear Information System (INIS)

    Nasrulloev, Kh.; Mirsaidov, U.

    2010-01-01

    Full text: It is known that the IAEA plays an important role in the facilitation of nuclear non-proliferation, as the international authority which carries out nuclear inspections. The Republic of Tajikistan signed the nuclear weapon non-proliferation treaty in 1997. Then, in 2004, the Safeguards agreement, additional protocol and small quantity protocol were signed. For 5 years the Republic of Tajikistan has submitted information on its nuclear activity as declarations, as foreseen in article 2.3 of the Additional protocol to the Safeguards agreement. Currently 66 declarations have been submitted. The information required in accordance with the Safeguards agreement and Additional Protocol is intended to allow the IAEA to compile a more detailed and exact picture of nuclear activity in Tajikistan, and it has the following purposes: the information will lead to more transparency, and make it possible for the IAEA to ensure with a high degree of confidence that no undeclared nuclear activity is concealed within the framework of the declared program; the more exact and comprehensive the information, the fewer questions and discrepancies arise; the required information is the basis for effective planning and realization of IAEA activity, related not only to safeguards implementation with regard to declared nuclear material but also to ensuring confidence in the absence of undeclared nuclear activity in Tajikistan. An IAEA inspection mission consisting of Messrs. N. Lazarev and F. Coillou visited Dushanbe in 2008 for verification of the republic's declarations on accounting for and control of nuclear materials under the Additional protocol and Small quantity protocol, and consultations were provided on correct declaration completion and on providing information on all nuclear materials. Besides, in 2006 a training course was conducted in Chkalovsk, with participation of Commonwealth of Independent States countries, on the Safeguards agreement and Additional protocol. These visits and events will facilitate the strengthening of weapons of mass destruction non

  12. Nuclear material statistical accountancy system

    International Nuclear Information System (INIS)

    Argentest, F.; Casilli, T.; Franklin, M.

    1979-01-01

    The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, NUMSAS can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based; the computational formula which is used by NUMSAS in calculating the standard deviation; and the information about nuclear material measurements and the plant measurement system which are required as data for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS. Each of these phases is implemented by a different computer program. The activities which are carried out in each phase can be summarised as follows: the pre-processing phase; the selection and up-date phase; the transformation phase; and the computation phase
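
    The kind of calculation NUMSAS performs can be sketched in miniature (a generic material-balance error propagation under assumed independent measurement errors; the figures are invented, and a real system would also model correlated and systematic error components):

      import numpy as np

      # Material balance: MUF = beginning inventory + receipts
      #                         - shipments - ending inventory
      measurements = {          # (value in kg, measurement std dev in kg)
          "beginning_inventory": (1200.0, 2.0),
          "receipts":            (300.0, 1.5),
          "shipments":           (280.0, 1.5),
          "ending_inventory":    (1217.0, 2.0),
      }
      signs = {"beginning_inventory": +1, "receipts": +1,
               "shipments": -1, "ending_inventory": -1}

      muf = sum(signs[k] * v for k, (v, _) in measurements.items())
      # Independent errors: variances add regardless of each term's sign.
      sigma_muf = np.sqrt(sum(s ** 2 for _, s in measurements.values()))
      print(f"MUF = {muf:.1f} kg, sigma(MUF) = {sigma_muf:.2f} kg")
      print("significant at 3 sigma" if abs(muf) > 3 * sigma_muf else "within noise")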

  13. Statistical mechanics of economics I

    Energy Technology Data Exchange (ETDEWEB)

    Kusmartsev, F.V., E-mail: F.Kusmartsev@lboro.ac.u [Department of Physics, Loughborough University, Leicestershire, LE11 3TU (United Kingdom)

    2011-02-07

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy in the USA between the years 1996 and 2008 and observe that over that time the income of the major part of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.

  14. Statistical mechanics of economics I

    International Nuclear Information System (INIS)

    Kusmartsev, F.V.

    2011-01-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy in the USA between the years 1996 and 2008 and observe that over that time the income of the major part of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
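
    The Bose-Einstein form invoked in these abstracts is the standard one (quoted for reference; the identification of its parameters with economic quantities is the paper's, not elaborated here):

      \[
      n(\varepsilon) = \frac{1}{e^{(\varepsilon - \mu)/T} - 1},
      \]

    with the chemical potential \(\mu\) and effective temperature \(T\) refitted for each year of the income data.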

  15. Improving Fuel Statistics for Danish Aviation

    DEFF Research Database (Denmark)

    Winther, M.

    This report contains fuel use figures for Danish civil aviation broken down into domestic and international numbers from 1985 to 2000, using a refined fuel split procedure and official fuel sale totals. The results from two different models are used. The NERI (National Environmental Research Institute) model estimates the fuel use per flight for all flights leaving Danish airports in 1998, while the annual Danish CORINAIR inventories are based on improved LTO/aircraft type statistics. A time series of fuel use from 1985 to 2000 is also shown for flights between Denmark and Greenland/the Faroe Islands, obtained with the NERI model. In addition a complete overview of the aviation fuel use from the two latter areas is given, based on fuel sale information from Statistics Greenland and Statistics Faroe Islands, and fuel use data from airline companies. The fuel use figures are presented on a level...

  16. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2017-01-01

    The revised second edition of this textbook provides the reader with a solid foundation in probability theory and statistics as applied to the physical sciences, engineering and related fields. It covers a broad range of numerical and analytical methods that are essential for the correct analysis of scientific data, including probability theory, distribution functions of statistics, fits to two-dimensional data and parameter estimation, Monte Carlo methods and Markov chains. Features new to this edition include: • a discussion of statistical techniques employed in business science, such as multiple regression analysis of multivariate datasets. • a new chapter on the various measures of the mean including logarithmic averages. • new chapters on systematic errors and intrinsic scatter, and on the fitting of data with bivariate errors. • a new case study and additional worked examples. • mathematical derivations and theoretical background material have been appropriately marked,to improve the readabili...

  17. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers which are determined by the deformation parameter q. This result shows that the results given in much literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating statistical distribution from relations of creation, annihilation, and number operators. ► A systemic study on the statistical distributions corresponding to various q-deformation schemes. ► Arguing that many results of q-deformation distributions in literature are inaccurate or incomplete
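
    For reference, the Gentile distribution with maximum occupation number n has the standard closed form (a textbook result, quoted for context):

      \[
      \langle N \rangle
      = \frac{1}{e^{\beta(\varepsilon-\mu)} - 1}
      - \frac{n+1}{e^{(n+1)\beta(\varepsilon-\mu)} - 1},
      \]

    which reduces to the Fermi-Dirac distribution for n = 1 and to the Bose-Einstein distribution as n → ∞.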

  18. Pattern formation and control of spatiotemporal chaos in a reaction diffusion prey–predator system supplying additional food

    International Nuclear Information System (INIS)

    Ghorai, Santu; Poria, Swarup

    2016-01-01

    Spatiotemporal dynamics of a predator–prey system with spatial diffusion is investigated in the presence of additional food for predators. Conditions for the stability of Hopf as well as Turing patterns in a spatial domain are determined by making use of linear stability analysis. The impact of additional food is clear from these conditions. Numerical simulation results are presented in order to validate the analytical findings. Finally, numerical simulations are carried out around the steady state under zero-flux boundary conditions. With the help of numerical simulations, the different types of spatial patterns (including stationary spatial patterns, oscillatory patterns, and spatiotemporal chaos) are identified in this diffusive predator–prey system in the presence of additional food, depending on the quantity and quality of the additional food, the spatial domain, and other parameters of the model. The key observation is that spatiotemporal chaos can be controlled by supplying suitable additional food to the predator. These investigations may be useful for understanding the complex spatiotemporal dynamics of population dynamical models in the presence of additional food.
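
    A minimal numerical sketch of the kind of simulation described (a generic 1-D diffusive predator-prey model with a constant additional-food term A for the predator, explicit Euler time stepping and zero-flux boundaries; the functional forms and parameters are illustrative, not the paper's model):

      import numpy as np

      nx, dx, dt, steps = 200, 1.0, 0.01, 20_000
      d1, d2 = 0.1, 4.0                  # prey / predator diffusivities
      a, b, m, A = 1.0, 2.0, 0.6, 0.2    # A = quantity of additional food (assumed)

      rng = np.random.default_rng(4)
      u = 0.4 + 0.01 * rng.standard_normal(nx)   # prey
      v = 0.2 + 0.01 * rng.standard_normal(nx)   # predator

      def lap(w):                 # 1-D Laplacian with zero-flux boundaries
          wp = np.pad(w, 1, mode="edge")
          return (wp[2:] - 2 * w + wp[:-2]) / dx**2

      for _ in range(steps):
          f = u / (1.0 + u + A)                       # prey eaten per predator
          du = u * (1 - u) - a * f * v
          dv = b * (f + A / (1.0 + u + A)) * v - m * v  # predator also uses food A
          u = np.maximum(u + dt * (d1 * lap(u) + du), 0.0)  # guard undershoot
          v = np.maximum(v + dt * (d2 * lap(v) + dv), 0.0)

      print("prey range:", u.min().round(3), u.max().round(3))  # non-uniform => pattern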

  19. Pierre Gy's sampling theory and sampling practice: heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  20. FEATURES OF THE APPLICATION OF STATISTICAL INDICATORS OF SCHEDULED FLIGHTS OF AIRCRAFT

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The possibilities of increasing the effectiveness of safety management for regular aircraft operations on the basis of a systematic approach, under normal operating conditions, are considered. These new opportunities within the airline are based on the integration of the Flight Safety Management System with the quality management system. So far, however, these possibilities have remained practically unrealized due to the limited application of statistical methods. A necessary condition for implementing the proposed approach is the use of statistical flight data from routine quality control of flights. The properties and peculiarities of applying statistical indicators of flight parameters during flight data monitoring are analyzed. It is shown that the main statistical indicators of the controlled process are averages and variations. The features of applying theoretical models of mathematical statistics in the analysis of flight information are indicated. It is noted that in practice the theoretical models often do not fit their intended framework of application because the initial assumptions are violated. Recommendations are given for the integrated use of statistical indicators in the current quality control of flights. Ultimately, the article concludes that the proposed approach makes it possible, on the basis of knowledge about the dynamics of statistical indicators of the controlled flight process, to identify hazards and to develop new safety indicators from aircraft flight operation data.

  1. A tale of two audits: statistical process control for improving diabetes care in primary care settings.

    Science.gov (United States)

    Al-Hussein, Fahad Abdullah

    2008-01-01

    Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs - those measuring clinical outcomes, process of care, or both. King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer, and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts (p charts), displayed for all employees. Data extraction, archiving, entry, analysis, plotting, and design and preparation of p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. The audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year of repeated process audits and feedback. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.
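
    The p charts used in these audits plot the proportion of non-conforming files per sample against 3-sigma binomial limits. A minimal sketch, with hypothetical counts rather than the study's data:

```python
import numpy as np

def p_chart_limits(nonconforming, n):
    """3-sigma p-chart limits for a series of audit samples of size n.
    `nonconforming` is the count of non-conforming files per sample."""
    p = np.asarray(nonconforming, dtype=float) / n
    p_bar = p.mean()                          # centre line
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)
    ucl = p_bar + 3.0 * sigma
    lcl = max(p_bar - 3.0 * sigma, 0.0)       # a proportion cannot be negative
    return p, p_bar, lcl, ucl

# Hypothetical counts for biweekly samples of 30 diabetes files.
counts = [6, 5, 7, 4, 5, 3, 2, 2, 1, 2]
p, p_bar, lcl, ucl = p_chart_limits(counts, 30)
print(f"centre line = {p_bar:.3f}, LCL = {lcl:.3f}, UCL = {ucl:.3f}")
print("out-of-control samples:", np.where((p < lcl) | (p > ucl))[0])
```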

  2. Applying Statistical Design to Control the Risk of Over-Design with Stochastic Simulation

    Directory of Open Access Journals (Sweden)

    Yi Wu

    2010-02-01

    Full Text Available By comparing a hard real-time system and a soft real-time system, this article elicits the risk of over-design in soft real-time system design. To deal with this risk, a novel concept of statistical design is proposed. Statistical design is the process of accurately accounting for and mitigating the effects of variation in part geometry and other environmental conditions, while at the same time optimizing a target performance factor. However, statistical design can be a very difficult and complex task when using classical mathematical methods. Thus, a simulation methodology to optimize the design is proposed in order to bridge the gap between real-time analysis and optimization for robust and reliable system design.
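
    A small illustration of the idea: instead of sizing a soft real-time system for the observed worst case (over-design), simulate the variation and check the deadline-miss rate against a tolerance. The distributions, deadline, and target below are illustrative assumptions, not values from the article.

```python
import numpy as np

# Monte Carlo sketch of "statistical design" for a soft real-time task.
rng = np.random.default_rng(42)
n = 100_000
exec_time = rng.lognormal(mean=np.log(8.0), sigma=0.25, size=n)  # ms, part/environment variation
jitter = rng.exponential(scale=0.5, size=n)                      # ms, scheduling noise
response = exec_time + jitter

deadline, target_miss = 15.0, 0.01
worst_case = response.max()
miss_rate = np.mean(response > deadline)
print(f"worst case observed: {worst_case:.1f} ms (hard-real-time sizing)")
print(f"miss rate at {deadline} ms deadline: {miss_rate:.4f} "
      f"({'meets' if miss_rate <= target_miss else 'violates'} the {target_miss:.0%} target)")
```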

  3. Statistical analysis of trace metals in the plasma of cancer patients versus controls

    International Nuclear Information System (INIS)

    Pasha, Qaisara; Malik, Salman A.; Shah, Munir H.

    2008-01-01

    Plasma samples from cancer patients (n = 112) and controls (n = 118) were analysed for selected trace metals (Al, Ca, Cd, Co, Cr, Cu, Fe, K, Li, Mg, Mn, Mo, Na, Ni, Pb, Sb, Sr and Zn) by flame atomic absorption spectroscopy. In the plasma of cancer patients, mean concentrations of the macronutrients/essential metals Na, K, Ca, Mg, Fe and Zn were 3971, 178, 44.1, 7.59, 4.38 and 3.90 ppm, respectively, while the mean metal levels in the plasma of controls were 3844, 151, 74.2, 18.0, 6.60 and 2.50 ppm, respectively. Average concentrations of Cd, Cr, Cu, Mn, Mo, Ni, Pb, Sb, Sr and Zn were noted to be significantly higher in the plasma of cancer patients compared with controls. Very strong mutual correlations (r > 0.70) in the plasma of cancer patients were observed between Fe-Mn, Ca-Mn, Ca-Ni, Ca-Co, Cd-Pb, Co-Ni, Mn-Ni, Mn-Zn, Cr-Li, Ca-Zn and Fe-Ni, whereas Ca-Mn, Ca-Mg, Fe-Zn, Ca-Zn, Mg-Mn, Mg-Zn, Cd-Sb, Cd-Co, Cd-Zn, Co-Sb and Sb-Zn exhibited strong relationships (r > 0.50) in the plasma of controls, all significant at p < 0.01. Principal component analysis (PCA) of the data extracted five PCs, both for cancer patients and controls, but with considerably different loadings. The average metal levels in male and female donors of the two groups were also evaluated and, in addition, the general role of trace metals in carcinogenesis was discussed. The study indicated an appreciably different pattern of metal distribution and mutual relationships in the plasma of cancer patients in comparison with controls.
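
    The correlation-plus-PCA workflow reported above can be sketched as follows on synthetic plasma-metal data (the study's measurements are not reproduced here; a hidden shared factor is injected merely so that strong correlations appear):

```python
import numpy as np

rng = np.random.default_rng(1)
metals = ["Ca", "Mg", "Fe", "Zn", "Mn", "Ni"]
base = rng.lognormal(sigma=0.2, size=(112, len(metals)))   # 112 synthetic "patients"
factor = rng.lognormal(sigma=0.4, size=(112, 1))           # shared factor induces correlation
X = base * factor

R = np.corrcoef(X, rowvar=False)            # pairwise Pearson correlations
strong = [(metals[i], metals[j], round(R[i, j], 2))
          for i in range(len(metals)) for j in range(i + 1, len(metals))
          if abs(R[i, j]) > 0.5]
print(len(strong), "pairs with |r| > 0.5:", strong)

# PCA via eigendecomposition of the correlation matrix.
eigval, eigvec = np.linalg.eigh(R)
explained = eigval[np.argsort(eigval)[::-1]] / eigval.sum()
print("variance explained by PCs:", np.round(explained, 3))
```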

  5. Statistical theory of neutron-nuclear reactions

    International Nuclear Information System (INIS)

    Moldauer, P.A.

    1981-01-01

    In addition to the topics dealt with by the author in his lectures at the Joint IAEA/ICTP Course held at Trieste in 1978, recent developments in the statistical theory of multistep reactions are reviewed as well as the transport theory and intranuclear cascade approaches to the description of nuclear multi-step processes. (author)

  6. Aminated sulfonated or sulfomethylated lignins as cement fluid loss control additives

    Energy Technology Data Exchange (ETDEWEB)

    Schilling, P.

    1991-05-07

    This patent describes a method of cementing a zone in a well penetrating a subterranean formation comprising injecting down the well and positioning in the zone to be cemented a hydraulic aqueous cement slurry composition. It comprises: a hydraulic cement, and the following expressed as parts by weight per 100 parts of the hydraulic cement, water from about 25 to 105 parts, and a fluid loss control additive comprising from about 0.5 to 2.5 parts of a compound selected from the group consisting of a sulfonated lignin and a sulfomethylated lignin, wherein the lignin has been aminated by reacting it with between about 2-5 moles of a polyamine and 2-5 moles of an aldehyde per 1,000 g of the lignin, and 0.1 to 1.5 parts of a compound selected from the group consisting of sodium carbonate, sodium metasilicate, sodium phosphate, sodium sulfite and sodium naphthalene sulfonate and a combination thereof.

  7. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, and system control datasets. The analysis of these data poses new and challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici

  8. Increasing Statistical Literacy by Exploiting Lexical Ambiguity of Technical Terms

    Directory of Open Access Journals (Sweden)

    Jennifer Kaplan

    2018-01-01

    Full Text Available Instructional inattention to language poses a barrier for students in entry-level science courses, in part because students may perceive a subject as difficult solely based on the lack of understanding of the vocabulary. In addition, the technical use of terms that have different everyday meanings may cause students to misinterpret statements made by instructors, leading to an incomplete or incorrect understanding of the domain. Terms that have different technical and everyday meanings are said to have lexical ambiguity and statistics, as a discipline, has many lexically ambiguous terms. This paper presents a cyclic process for designing activities to address lexical ambiguity in statistics. In addition, it describes three short activities aimed to have high impact on student learning associated with two different lexically ambiguous words or word pairs in statistics. Preliminary student-level data are used to assess the efficacy of the activities, and future directions for development of activities and research about lexical ambiguity in statistics in particular and STEM in general are discussed.

  9. Development of statistical and analytical techniques for use in national quality control schemes for steroid hormones

    International Nuclear Information System (INIS)

    Wilson, D.W.; Gaskell, S.J.; Fahmy, D.R.; Joyce, B.G.; Groom, G.V.; Griffiths, K.; Kemp, K.W.; Nix, A.B.J.; Rowlands, R.J.

    1979-01-01

    Adopting the rationale that the improvement of intra-laboratory performance of immunometric assays will enable the assessment of national QC schemes to become more meaningful, the group of participating laboratories has developed statistical and analytical techniques for the improvement of accuracy, precision and monitoring of error for the determination of steroid hormones. These developments are now described and their relevance to NQC schemes discussed. Attention has been focussed on some of the factors necessary for improving standards of quality in immunometric assays and their relevance to laboratories participating in NQC schemes as described. These have included the 'accuracy', precision and robustness of assay procedures as well as improved methods for internal quality control. (Auth.)

  10. A robust statistical method for association-based eQTL analysis.

    Directory of Open Access Journals (Sweden)

    Ning Jiang

    Full Text Available It has been well established that theoretical kernel for recently surging genome-wide association study (GWAS is statistical inference of linkage disequilibrium (LD between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors of which population stratification is the most prominent. Whilst many methods have been proposed to correct for the influence either through predicting the structure parameters or correcting inflation in the test statistic due to the stratification, these may not be feasible or may impose further statistical problems in practical implementation.We propose here a novel statistical method to control spurious LD in GWAS from population structure by incorporating a control marker into testing for significance of genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need of structure prediction which may be infeasible or inadequate in practice and accounts properly for a varying effect of population stratification on different regions of the genome under study. Utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations.The analyses show that the new method confers an improved statistical power for detecting genuine genetic association in subpopulations and an effective control of spurious associations stemmed from population structure when compared with other two popularly implemented methods in the literature of GWAS.

  11. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  12. Feasibility and effects of newly developed balance control trainer for mobility and balance in chronic stroke patients: a randomized controlled trial.

    Science.gov (United States)

    Lee, So Hyun; Byun, Seung Deuk; Kim, Chul Hyun; Go, Jin Young; Nam, Hyeon Uk; Huh, Jin Seok; Jung, Tae Du

    2012-08-01

    To investigate the feasibility and effects of balance training with a newly developed Balance Control Trainer (BCT) that applies the concept of vertical movement, for the improvement of mobility and balance in chronic stroke patients. Forty chronic stroke patients were randomly assigned to an experimental or a control group. The experimental group (n=20) underwent training with a BCT for 20 minutes a day, 5 days a week for 4 weeks, in addition to concurrent conventional physical therapy. The control group (n=20) underwent only conventional therapy for 4 weeks. All participants were assessed by: the Functional Ambulation Categories (FAC), 10-meter Walking Test (10mWT), Timed Up and Go test (TUG), Berg Balance Scale (BBS), Korean Modified Barthel Index (MBI), and Manual Muscle Test (MMT) before training, and at 2 and 4 weeks of training. There were statistically significant improvements in all parameters except knee extensor power at 2 weeks of treatment, and all parameters except MBI showed further statistically significant progress in the experimental group over the next two weeks. Training with the BCT thus appears feasible and beneficial for balance and gait in ambulatory chronic stroke patients. Furthermore, it may provide additional benefits when used in conjunction with conventional therapies.

  13. Sensometrics: Thurstonian and Statistical Models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen

    This thesis is concerned with the development and bridging of Thurstonian and statistical models for sensory discrimination testing as applied in the scientific discipline of sensometrics. In sensory discrimination testing, sensory differences between products are detected and quantified by the use ... and sensory discrimination testing in particular in a series of papers, by advancing Thurstonian models for a range of sensory discrimination protocols in addition to facilitating their application by providing software for fitting these models. The main focus is on identifying Thurstonian models ... sensR is a package for sensory discrimination testing with Thurstonian models, and ordinal supports analysis of ordinal data with cumulative link (mixed) models. While sensR is closely connected to the sensometrics field, the ordinal package has developed into a generic statistical package applicable ...

  14. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  15. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  16. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  17. Modeling the Aneuploidy Control of Cancer

    Directory of Open Access Journals (Sweden)

    Wang Zhong

    2010-07-01

    Full Text Available Abstract Background: Aneuploidy has long been recognized to be associated with cancer. A growing body of evidence suggests that tumorigenesis, the formation of new tumors, can be attributed to some extent to errors occurring at the mitotic checkpoint, a major cell cycle control mechanism that acts to prevent chromosome missegregation. However, so far no statistical model has been available to quantify the role aneuploidy plays in determining cancer. Methods: We develop a statistical model for testing the association between aneuploidy loci and cancer risk in a genome-wide association study. The model incorporates quantitative genetic principles into a mixture-model framework in which various genetic effects, including additive, dominant, imprinting, and their interactions, are estimated by implementing the EM algorithm. Results: Under the new model, a series of hypothesis tests are formulated to explain the pattern of the genetic control of cancer through aneuploid loci. Simulation studies were performed to investigate the statistical behavior of the model. Conclusions: The model will provide a tool for estimating the effects of genetic loci on aneuploidy abnormality in genome-wide studies of cancer cells.

  18. Topological and statistical properties of quantum control transition landscapes

    International Nuclear Information System (INIS)

    Hsieh, Michael; Wu Rebing; Rabitz, Herschel; Rosenthal, Carey

    2008-01-01

    A puzzle arising in the control of quantum dynamics is to explain the relative ease with which high-quality control solutions can be found in the laboratory and in simulations. The emerging explanation appears to lie in the nature of the quantum control landscape, which is an observable as a function of the control variables. This work considers the common case of the observable being the transition probability between an initial and a target state. For any controllable quantum system, this landscape contains only global maxima and minima, and no local extrema traps. The probability distribution function for the landscape value is used to calculate the relative volume of the region of the landscape corresponding to good control solutions. The topology of the global optima of the landscape is analysed and the optima are shown to have inherent robustness to variations in the controls. Although the relative landscape volume of good control solutions is found to shrink rapidly as the system Hilbert space dimension increases, the highly favourable landscape topology at and away from the global optima provides a rationale for understanding the relative ease of finding high-quality, stable quantum optimal control solutions
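
    A toy stand-in for the landscape idea (static control amplitudes rather than the paper's control fields): compute the transition probability between two levels as a function of two control parameters and inspect its extremes. For a controllable system the paper reports only global maxima and minima; this small grid lets one look at that numerically.

```python
import numpy as np
from scipy.linalg import expm

# Transition-probability landscape P(c1, c2) = |<1| exp(-i(H0 + c1*X + c2*Y)) |0>|^2
# for a two-level system, with Pauli matrices as illustrative control Hamiltonians.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
H0 = np.diag([0.0, 1.0]).astype(complex)

c = np.linspace(-3, 3, 61)
P = np.empty((c.size, c.size))
for i, c1 in enumerate(c):
    for j, c2 in enumerate(c):
        U = expm(-1j * (H0 + c1 * X + c2 * Y))
        P[i, j] = abs(U[1, 0]) ** 2          # |<1|U|0>|^2

print("landscape min/max over the grid:", P.min(), P.max())
```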

  19. Risk of adverse events with bevacizumab addition to therapy in advanced non-small-cell lung cancer: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Lai XX

    2016-04-01

    Full Text Available Xi-Xi Lai, Ren-Ai Xu, Yu-Ping Li, Han Yang Department of Respiratory Medicine, The First Affiliated Hospital of Wenzhou Medical University, Wenzhou, Zhejiang, People’s Republic of China Background: Bevacizumab, a monoclonal antibody against the vascular endothelial growth factor ligand, has shown survival benefits in the treatment of many types of malignant tumors, including non-small-cell lung cancer (NSCLC). We conducted this systematic review and meta-analysis to investigate the risk of the most clinically relevant adverse events related to bevacizumab in advanced NSCLC. Methods: Databases from PubMed, Web of Science, and Cochrane Library up to August 2015 were searched to identify relevant studies. We included prospective randomized controlled Phase II/III clinical trials that compared therapy with or without bevacizumab for advanced NSCLC. Summary relative risks (RRs) and 95% confidence intervals were calculated using random effects or fixed effects according to the heterogeneity among included trials. Results: A total of 3,745 patients from nine clinical trials were included in the meta-analysis. Summary RRs showed a statistically significant bevacizumab-associated increased risk in three of the adverse outcomes studied: proteinuria (RR = 7.55), hypertension (RR = 5.34), and hemorrhagic events (RR = 2.61). No statistically significant differences were found for gastrointestinal perforation (P = 0.60), arterial and venous thromboembolic events (P = 0.35 and P = 0.92, respectively), or fatal events (P = 0.29). Conclusion: The addition of bevacizumab to therapy in advanced NSCLC did significantly increase the risk of proteinuria, hypertension, and hemorrhagic events, but not arterial/venous thromboembolic events, gastrointestinal perforation, or fatal adverse events. Keywords: toxicities, angiogenesis inhibitors, non-small-cell lung carcinoma, meta-analysis, safety
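
    Summary RRs of the kind quoted above are commonly obtained by inverse-variance pooling of log relative risks. A fixed-effect textbook sketch with made-up trial counts, not the meta-analysis data:

```python
import numpy as np

def pooled_rr(events_t, n_t, events_c, n_c):
    """Fixed-effect inverse-variance pooling of log relative risks."""
    log_rr = np.log((events_t / n_t) / (events_c / n_c))
    var = 1.0 / events_t - 1.0 / n_t + 1.0 / events_c - 1.0 / n_c  # var of log RR
    w = 1.0 / var
    pooled = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

events_t = np.array([12.0, 20.0, 9.0])    # hypothetical event counts, treatment arms
n_t      = np.array([300.0, 450.0, 210.0])
events_c = np.array([3.0, 5.0, 2.0])      # hypothetical event counts, control arms
n_c      = np.array([295.0, 440.0, 205.0])
rr, lo, hi = pooled_rr(events_t, n_t, events_c, n_c)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```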

  20. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    Full Text Available For any quality evaluation in sports, science, education, and so on, it is useful to collect data in order to construct a strategy to improve the quality of services offered to users. For this purpose, we use statistical software packages to process the collected data, with the aim of increasing customer satisfaction. The principle is demonstrated by the example of student satisfaction ratings at Belgrade Polytechnic, where students, as users, rate the quality of the institution. Here, the emphasis is on statistical analysis as a tool for quality control aimed at improvement, and not on the interpretation of results. Therefore, the above can be used as a model in sport to improve overall results.

  1. The effect of solvents and hydrophilic additive on stable coating and controllable sirolimus release system for drug-eluting stent.

    Science.gov (United States)

    Kim, Seong Min; Park, Sung-Bin; Bedair, Tarek M; Kim, Man-Ho; Park, Bang Ju; Joung, Yoon Ki; Han, Dong Keun

    2017-09-01

    Various drug-eluting stents (DESs) have been developed to prevent restenosis after stent implantation. However, DES still needs improvement in drug-in-polymer coating stability and control of drug release for effective clinical treatment. In this study, the cobalt-chromium (CoCr) alloy surface was coated with biodegradable poly(D,L-lactide) (PDLLA) and sirolimus (SRL) mixed with a hydrophilic Pluronic F127 additive, using an ultrasonic spray coating system, in order to achieve a stable coating surface and controlled SRL release. The degradation of the PDLLA/SRL coating was studied under physiological solution. It was found that adding F127 reduced the degradation of PDLLA and improved the coating stability over 60 days. The effects of organic solvents such as chloroform and tetrahydrofuran (THF) on coating uniformity were also examined. THF produced a very smooth and uniform coating compared to chloroform. The patterns of in vitro drug release according to the type of organic solvent and hydrophilic additive suggest the possibility of a controllable drug release design in DES. With F127, drug release was sustained regardless of the organic solvent used. In addition, THF yielded a faster yet controlled release profile compared to chloroform. The structure of SRL molecules in different organic solvents was investigated using ultra-small angle neutron scattering. The structure of SRL is concentration-dependent in chloroform, becoming more compact at high concentration, but concentration-independent in THF. These results strongly demonstrate that coating stability and drug release patterns can be tuned through the physicochemical properties of various parameters such as organic solvent, additive, and coating strategy.

  2. 34 CFR 668.49 - Institutional fire safety policies and fire statistics.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Institutional fire safety policies and fire statistics... fire statistics. (a) Additional definitions that apply to this section. Cause of fire: The factor or...; however, it does not include indirect loss, such as business interruption. (b) Annual fire safety report...

  3. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    Science.gov (United States)

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  4. Statistics and Corporate Environmental Management: Relations and Problems

    DEFF Research Database (Denmark)

    Madsen, Henning; Ulhøi, John Parm

    1997-01-01

    Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions in the external environment. The nature and extent of the practical use of quantitative techniques in corporate environmental management systems is discussed on the basis of a number of company surveys in four European countries.

  5. Statistical aspects of food safety sampling

    NARCIS (Netherlands)

    Jongenburger, I.; Besten, den H.M.W.; Zwietering, M.H.

    2015-01-01

    In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of
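
    A back-of-envelope illustration of the stochastic nature of sampling noted above: under the simplifying assumption of a well-mixed batch, the probability that n sampled units detect contamination present at prevalence p is 1 − (1 − p)^n; the clustering of microorganisms discussed in the article makes detection harder still. The prevalences and sample sizes are illustrative.

```python
# Detection probability of a simple attributes sampling plan,
# assuming independent, well-mixed units (an idealization).
for p in (0.001, 0.01, 0.05):          # contamination prevalence in the batch
    for n in (5, 30, 60):              # number of sampled units
        p_detect = 1.0 - (1.0 - p) ** n
        print(f"prevalence {p:.1%}, n={n:2d}: detection probability {p_detect:.2f}")
```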

  6. A review of the statistical principles of geochronometry. II. Additional concepts pertinent to radiogenic U-Pb studies

    International Nuclear Information System (INIS)

    Eglington, B.M.; Harmer, R.E.

    1993-01-01

    A summary is provided of statistical regression techniques as applied to radiogenic uranium-lead data. The model-dependent nature of U-Pb regression calculations, both for isochrons and errorchrons, is emphasized throughout. Near-concordant U-Pb radiogenic data preserve better information about the original age of the samples than do more discordant data, yet most conventional regression techniques assign more importance to the discordant data than to those near concordia. The links between mathematical regression techniques and conceptual models are highlighted and critically examined, and methods are illustrated to deal with the discordant data. Comparison of dates from different laboratories or researchers requires that the techniques applied be statistically valid and, in most cases, that the model-dependent assumptions be compatible. This is particularly important for U-Pb radiogenic data, where the influence of model-dependent assumptions may be greater than in the case of whole-rock techniques. A consistent approach is proposed for treating data at South African laboratories in order to facilitate comparison of results. Recommendations are presented as regards the minimum requirements to be met when reporting radiogenic U-Pb isotope data so that future geochronologists may benefit. 35 refs., 2 tabs., 6 figs
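
    A simplified numerical stand-in for the regression ideas above: an error-weighted straight-line fit with the MSWD statistic used to separate isochrons from errorchrons. Full U-Pb treatments (e.g. York-type regression) also propagate x-errors and error correlations; the data below are hypothetical.

```python
import numpy as np

def weighted_fit(x, y, sy):
    """Error-weighted straight-line fit returning intercept, slope and MSWD."""
    w = 1.0 / np.asarray(sy) ** 2
    xb, yb = np.average(x, weights=w), np.average(y, weights=w)
    b = np.sum(w * (x - xb) * (y - yb)) / np.sum(w * (x - xb) ** 2)
    a = yb - b * xb
    mswd = np.sum(w * (y - a - b * x) ** 2) / (len(x) - 2)
    return a, b, mswd

# Hypothetical isochron-style data: the slope carries the age information.
x = np.array([0.5, 1.1, 2.0, 2.8, 3.5])
y = 0.70 + 0.05 * x + np.array([0.001, -0.002, 0.002, -0.001, 0.001])
sy = np.full(5, 0.002)
a, b, mswd = weighted_fit(x, y, sy)
print(f"intercept = {a:.4f}, slope = {b:.4f}, MSWD = {mswd:.2f}")
# MSWD near 1 indicates an isochron; MSWD >> 1 flags an errorchron
# (scatter beyond analytical errors), echoing the model dependence above.
```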

  7. Combination of antagonistic yeasts with two food additives for control of brown rot caused by Monilinia fructicola on sweet cherry fruit.

    Science.gov (United States)

    Qin, G Z; Tian, S P; Xu, Y; Chan, Z L; Li, B Q

    2006-03-01

    To evaluate the beneficial effect of two food additives, ammonium molybdate (NH4-Mo) and sodium bicarbonate (NaBi), on antagonistic yeasts for control of brown rot caused by Monilinia fructicola in sweet cherry fruit under various storage conditions. The mechanisms by which the food additives enhance the efficacy of antagonistic yeasts were also evaluated. Biocontrol activity of Pichia membranefaciens and Cryptococcus laurentii against brown rot in sweet cherry fruit was improved by the addition of 5 mmol l(-1) NH4-Mo or 2% NaBi when stored in air at 20 and 0 degrees C, and in controlled atmosphere (CA) storage with 10% O2 + 10% CO2 at 0 degrees C. Population dynamics of P. membranefaciens in the wounds of fruit were inhibited by NH4-Mo at 20 degrees C after 1 day of incubation, and growth of C. laurentii was inhibited by NH4-Mo at 0 degrees C in CA storage after 60 days. In contrast, NaBi did not significantly influence growth of the two yeasts in fruit wounds under the various storage conditions, except that growth of P. membranefaciens was stimulated after 45 days at 0 degrees C in CA storage. When used alone, the two additives showed effective control of brown rot in sweet cherry fruit, and the efficacy was closely correlated with the concentrations used. The in vitro results indicated that growth of M. fructicola was significantly inhibited by NH4-Mo and NaBi. Application of the additives improved biocontrol of brown rot on sweet cherry fruit under various storage conditions. It is postulated that the enhancement of disease control is due directly to the inhibitory effects of the additives on pathogen growth, and indirectly to the relatively small influence of the additives on the growth of the antagonistic yeasts. The results obtained in this study suggest that integrating NH4-Mo or NaBi with biocontrol agents has great potential in the commercial management of postharvest diseases of fruit.

  8. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
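
    The harmonic Poisson process described above can be simulated directly. With intensity λ(t) = c/t on [a, b], the cumulative intensity is Λ(t) = c·ln(t/a), so mapping unit-rate arrivals through the inverse gives the process; the scale invariance shows up as equal expected counts per decade. Parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, c = 1.0, 1000.0, 5.0

total = c * np.log(b / a)                 # expected number of points on [a, b]
n = rng.poisson(total)
s = np.sort(rng.uniform(0.0, total, n))   # unit-rate arrivals, conditioned on the count
t = a * np.exp(s / c)                     # arrival times with harmonic intensity c/t

# Scale-invariance check: each decade should contain c*log(10) points on average.
for lo, hi in [(1, 10), (10, 100), (100, 1000)]:
    print(f"points in [{lo},{hi}]: {np.sum((t >= lo) & (t < hi))} "
          f"(expected {c * np.log(10):.1f})")
```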

  10. 12th Workshop on Stochastic Models, Statistics and Their Applications

    CERN Document Server

    Rafajłowicz, Ewaryst; Szajowski, Krzysztof

    2015-01-01

    This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.

  11. Statistical mechanics for a class of quantum statistics

    International Nuclear Information System (INIS)

    Isakov, S.B.

    1994-01-01

    Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived

  12. Descriptive and inferential statistical methods used in burns research.

    Science.gov (United States)

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals

  13. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques: tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method with a detection accuracy of 82% and an average detection time of 9.64 days.
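
    A minimal EWMA chart sketch of the kind evaluated above, using the standard recursion z_i = λx_i + (1 − λ)z_{i−1} and exact time-varying limits; the transfer-time data are simulated with a small upward shift, not taken from the study.

```python
import numpy as np

def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA control chart; returns the EWMA statistic and the index of the
    first out-of-control sample, or -1 if none (parameters illustrative)."""
    z = np.empty(len(x))
    prev = mu0
    for i, xi in enumerate(x):
        prev = lam * xi + (1.0 - lam) * prev
        z[i] = prev
    i = np.arange(1, len(x) + 1)
    half_width = L * sigma0 * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i)))
    out = np.where(np.abs(z - mu0) > half_width)[0]
    return z, (out[0] if out.size else -1)

# Simulated daily transfer times (seconds): a small upward shift after day 30.
rng = np.random.default_rng(5)
times = np.concatenate([rng.normal(10.0, 1.0, 30), rng.normal(11.0, 1.0, 30)])
_, first_alarm = ewma_chart(times, mu0=10.0, sigma0=1.0)
print("first out-of-control day:", first_alarm)
```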

  14. IMPACTS OF ANTIFOAM ADDITIONS AND ARGON BUBBLING ON DEFENSE WASTE PROCESSING FACILITY REDUCTION/OXIDATION

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, C.; Johnson, F.

    2012-06-05

    During melting of HLW glass, the REDOX of the melt pool cannot be measured. Therefore, the Fe²⁺/ΣFe ratio in the glass poured from the melter must be related to melter feed organic and oxidant concentrations to ensure production of a high quality glass without impacting production rate (e.g., foaming) or melter life (e.g., metal formation and accumulation). A production facility such as the Defense Waste Processing Facility (DWPF) cannot wait until the melt or waste glass has been made to assess its acceptability, since by then no further changes to the glass composition and acceptability are possible. Therefore, the acceptability decision is made on the upstream process, rather than on the downstream melt or glass product. That is, it is based on 'feed forward' statistical process control (SPC) rather than statistical quality control (SQC). In SPC, the feed composition to the melter is controlled prior to vitrification. Use of the DWPF REDOX model has controlled the balance of feed reductants and oxidants in the Sludge Receipt and Adjustment Tank (SRAT). Once the alkali/alkaline earth salts (both reduced and oxidized) are formed during reflux in the SRAT, the REDOX can only change if (1) additional reductants or oxidants are added to the SRAT, the Slurry Mix Evaporator (SME), or the Melter Feed Tank (MFT) or (2) if the melt pool is bubbled with an oxidizing gas or a sparging gas that imposes a different REDOX target than the chemical balance set during reflux in the SRAT.

  15. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    An on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, was established by combining near infrared spectroscopy with multivariate data analysis technology. A multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored using PC scores, DModX and Hotelling T² control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus, and can reflect changes in material properties during production in real time. This process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections.
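
    The T² and DModX-style monitoring described above can be sketched with a PCA model fitted on normal batches: Hotelling T² tracks variation inside the model, and the squared prediction error tracks distance to the model. Random stand-ins replace the NIR spectra, and the control limits themselves are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)
X_train = rng.standard_normal((50, 100))          # 50 normal-batch "spectra"
mu, sd = X_train.mean(0), X_train.std(0) + 1e-12
Xc = (X_train - mu) / sd                          # autoscaling

k = 3                                             # retained principal components
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:k].T                                      # loadings
scores = Xc @ P
score_var = scores.var(axis=0, ddof=1)

def monitor(x_new):
    xc = (x_new - mu) / sd
    t = xc @ P
    T2 = np.sum(t**2 / score_var)                 # Hotelling T^2
    spe = np.sum((xc - t @ P.T) ** 2)             # squared prediction error (DModX-like)
    return T2, spe

print("in-control sample :", monitor(X_train[0]))
print("disturbed sample  :", monitor(X_train[0] + 0.8))  # offset inflates SPE
```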

  16. Tomato farmers adoption level of postharvest value addition ...

    African Journals Online (AJOL)

    The study examined tomato farmers' adoption level of postharvest value addition technology and its constraints in Surulere Area of Oyo state. 160 tomato farmers were randomly selected and interviewed through structured interview schedule. Data obtained were subjected to descriptive and inferential statistics. Results ...

  17. HOW TO SELECT APPROPRIATE STATISTICAL TEST IN SCIENTIFIC ARTICLES

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-09-01

    Full Text Available Statistics is a mathematical science dealing with the collection, analysis, interpretation, and presentation of masses of numerical data in order to draw relevant conclusions. Statistics is a form of mathematical analysis that uses quantified models, representations and synopses for a given set of experimental data or real-life studies. Students and young researchers in biomedical sciences and in special education and rehabilitation often declare that they chose to enroll in that study program because they lack knowledge of, or interest in, mathematics. This is a sad statement, but there is much truth in it. The aim of this editorial is to help young researchers to select the statistics or statistical techniques and statistical software appropriate for the purposes and conditions of a particular analysis. The most important statistical tests are reviewed in the article. Knowing how to choose the right statistical test is an important asset and decision in research data processing and in the writing of scientific papers. Young researchers and authors should know how to choose and how to use statistical methods. The competent researcher will need knowledge of statistical procedures. That might include an introductory statistics course, and it most certainly includes using a good statistics textbook. For this purpose, there is a need to restore Statistics as a mandatory subject in the curriculum of the Institute of Special Education and Rehabilitation at the Faculty of Philosophy in Skopje. Young researchers need additional courses in statistics. They need to train themselves to use statistical software in an appropriate way.
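
    One branch of such test selection can be automated along the lines below: check normality first, then choose between a parametric and a non-parametric two-sample test. Thresholds and data are illustrative, and a real analysis would also weigh sample size and variance assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
group_a = rng.normal(50.0, 8.0, 40)
group_b = rng.normal(55.0, 8.0, 40)

# Shapiro-Wilk normality check for each group (p > 0.05 taken as "normal enough").
normal_a = stats.shapiro(group_a)[1] > 0.05
normal_b = stats.shapiro(group_b)[1] > 0.05

if normal_a and normal_b:
    stat, p = stats.ttest_ind(group_a, group_b)
    test = "independent t-test"
else:
    stat, p = stats.mannwhitneyu(group_a, group_b)
    test = "Mann-Whitney U"
print(f"{test}: statistic = {stat:.2f}, p = {p:.4f}")
```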

  18. Spatially Controlled Delivery of siRNAs to Stem Cells in Implants Generated by Multi-Component Additive Manufacturing

    DEFF Research Database (Denmark)

    Andersen, Morten Østergaard; Le, Dang Quang Svend; Chen, Muwan

    2013-01-01

    Additive manufacturing is a promising technique in tissue engineering, as it enables truly individualized implants to be made to fit a particular defect. As previously shown, a feasible strategy for producing complex multicellular tissues is to deposit different small interfering RNAs (siRNAs) in porous implants that are subsequently sutured together. In this study, an additive manufacturing strategy is applied to deposit carbohydrate hydrogels containing different siRNAs into an implant in a spatially controlled manner. When the obtained structures are seeded with mesenchymal stem (stromal) cells, the selected siRNAs are delivered to the cells and induce specific and localized gene silencing. Here, it is demonstrated how to replicate part of a patient's spinal cord from a computed tomography scan, using an additive manufacturing technique to produce an implant with compartmentalized si...

  19. Mathematical and statistical applications in life sciences and engineering

    CERN Document Server

    Adhikari, Mahima; Chaubey, Yogendra

    2017-01-01

    The book includes articles from eminent international scientists discussing a wide spectrum of topics of current importance in mathematics and statistics and their applications. It presents state-of-the-art material along with a clear and detailed review of the relevant topics and issues concerned. The topics discussed include message transmission, colouring problem, control of stochastic structures and information dynamics, image denoising, life testing and reliability, survival and frailty models, analysis of drought periods, prediction of genomic profiles, competing risks, environmental applications and chronic disease control. It is a valuable resource for researchers and practitioners in the relevant areas of mathematics and statistics.

  20. Changing redox potential by controlling soil moisture and addition of inorganic oxidants to dissipate pentachlorophenol in different soils

    International Nuclear Information System (INIS)

    Lin Jiajiang; He Yan; Xu Jianming

    2012-01-01

    The potential for dissipation of pentachlorophenol (PCP) was investigated in soils from four different sites in China. These were an Umbraqualf (Soil 1), a Plinthudult (Soil 2), a Haplustalf (Soil 3) and an Argiustoll (Soil 4), which were either flooded, to produce anaerobic conditions, or incubated aerobically at 60% water-holding capacity (WHC). The dissipation of PCP in Soil 1 at 60% WHC was higher than under flooded conditions, while the opposite occurred in the other three soils. Under flooded conditions, the redox potential decreased significantly in Soil 1 and Soil 4, where sulphate reduction occurred and the dissipation of PCP was statistically significant (about 96% and 98%, respectively) at the end of incubation. After addition of inorganic oxidants, dissipation of PCP was significantly inhibited by FeCl₃, while Na₂SO₄ and NaNO₃ had different effects, depending upon the soil type. - Highlights: ► The extent of the aerobic/anaerobic interface depends upon the soil properties. ► The dissipation of PCP was accelerated in some soils due to the soil-water interface. ► The addition of oxidants inhibited the decrease in soil redox potential. ► Most external oxidants added under flooded conditions inhibited PCP dechlorination. - The addition of inorganic oxidants limited the decrease in redox potential and inhibited the reductive dechlorination of pentachlorophenol.