WorldWideScience

Sample records for statistical survey control

  1. Pseudo-populations: a basic concept in statistical surveys

    CERN Document Server

    Quatember, Andreas

    2015-01-01

    This book emphasizes that artificial or pseudo-populations play an important role in statistical surveys from finite universes in two ways: firstly, the concept of pseudo-populations may substantially improve users’ understanding of various aspects of sampling theory and survey methodology; an example of this scenario is the Horvitz-Thompson estimator. Secondly, statistical procedures exist in which pseudo-populations actually have to be generated. An example of such a scenario can be found in simulation studies in the field of survey sampling, where close-to-reality pseudo-populations are generated from known sample and population data to form the basis for the simulation process. The chapters focus on estimation methods, sampling techniques, nonresponse, questioning designs and statistical disclosure control. This book is a valuable reference for understanding the importance of the pseudo-population concept and applying it in teaching and research.
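
    To make the pseudo-population reading of the Horvitz-Thompson estimator concrete (this is the standard textbook definition, not a formula quoted from the book): for a sample s drawn from a finite universe with inclusion probabilities \pi_i, the population total of a variable y is estimated by

        \hat{t}_{y,\mathrm{HT}} = \sum_{i \in s} \frac{y_i}{\pi_i},

    which can be read as the exact total of a pseudo-population in which each sampled unit is replicated 1/\pi_i times.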

  2. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  3. Statistical Survey of Non-Formal Education

    Directory of Open Access Journals (Sweden)

    Ondřej Nývlt

    2012-12-01

    … focused on a programme within a regular education system. Labour market flexibility and new requirements on employees create a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 focused on the individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, in such a way that annual LFS data can be enlarged with detailed information from the AES at five-year intervals. This article describes the reliability of statistical data about non-formal education. This analysis is usually connected with sampling and non-sampling errors.

  4. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    A statistical survey is an effective method of statistical investigation that involves gathering quantitative data. It is often preferred in statistical reports because it yields information about the entire population under study while observing only a part of it. For this reason, surveys are used in many research areas. In economics, they support decision making, the choice of competitive strategies, the analysis of economic phenomena and the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling can be used to analyze the existing parking situation in a given locality.
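
    As a rough illustration of the kind of calculation the paper describes (a minimal sketch with invented numbers, not the study's data), simple random sampling of a frame of parking spaces gives an occupancy estimate with a design-based margin of error:

      import math
      import random

      random.seed(1)

      # Hypothetical frame: occupancy status of N parking spaces (1 = occupied).
      N = 2000
      population = [1 if random.random() < 0.63 else 0 for _ in range(N)]

      # Simple random sample without replacement.
      n = 200
      sample = random.sample(population, n)

      p_hat = sum(sample) / n                        # estimated occupancy rate
      fpc = (N - n) / (N - 1)                        # finite population correction
      se = math.sqrt(fpc * p_hat * (1 - p_hat) / n)  # standard error of p_hat
      print(f"estimate {p_hat:.3f} +/- {1.96 * se:.3f} (95% CI)")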

  5. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when

  6. Statistics available for site studies in registers and surveys at Statistics Sweden

    International Nuclear Information System (INIS)

    Haldorson, Marie

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  7. Statistical Estimators Using Jointly Administrative and Survey Data to Produce French Structural Business Statistics

    Directory of Open Access Journals (Sweden)

    Brion Philippe

    2015-12-01

    Using as much administrative data as possible is a general trend among most national statistical institutes. Different kinds of administrative sources, from tax authorities or other administrative bodies, are very helpful material in the production of business statistics. However, these sources often have to be completed by information collected through statistical surveys. This article describes the way Insee has implemented such a strategy in order to produce French structural business statistics. The originality of the French procedure is that administrative and survey variables are used jointly for the same enterprises, unlike the majority of multisource systems, in which the two kinds of sources generally complement each other for different categories of units. The idea is to use, as much as possible, the richness of the administrative sources combined with the timeliness of a survey, even if the latter is conducted only on a sample of enterprises. One main issue is the classification of enterprises within the NACE nomenclature, which is a cornerstone variable in producing the breakdown of the results by industry. At a given date, two values of the corresponding code may coexist: the value of the register, not necessarily up to date, and the value resulting from the data collected via the survey, but only from a sample of enterprises. Using all this information together requires the implementation of specific statistical estimators combining some properties of the difference estimators with calibration techniques. This article presents these estimators, as well as their statistical properties, and compares them with those of other methods.
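
    The joint use of an administrative proxy and a survey variable can be sketched with the textbook difference estimator (a generic form given here for intuition; the article's estimators additionally involve calibration and the NACE classification issue): with x_i known for every unit of the population U from administrative sources and y_i observed only on the sample s with inclusion probabilities \pi_i, the total of y is estimated by

        \hat{t}_{y,\mathrm{diff}} = \sum_{i \in U} x_i + \sum_{i \in s} \frac{y_i - x_i}{\pi_i},

    i.e., the administrative total corrected by a design-weighted estimate of the survey-to-administrative discrepancy.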

  8. Water Quality Attainment Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Designated uses assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality...

  9. Water Quality Stressor Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Stressors assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality assessments...

  10. Survey procedure: Control and accountability of nuclear materials

    International Nuclear Information System (INIS)

    Van Ness, H.

    1987-02-01

    This procedure outlines the method by which the Department of Energy (DOE) San Francisco Operations Office (SAN) will plan and execute periodic field surveys of the Material Control and Accountability (MC and A) program and practices at designated contractors' facilities. The surveys will be conducted in accordance with DOE Order 5630.7, Control and Accountability of Nuclear Materials Surveys (7/8/81) to ascertain compliance with applicable DOE Orders and SAN Management Directives in the 5630 series, as well as the adequacy of the contractor's program and procedures. Surveys will be conducted by the Safeguards and Security Division of DOE-SAN. The survey team will review and evaluate the adequacy of the contractor's procedures and practices for nuclear material control and accounting by means of physical inventory, internal control, measurement and statistics, material control indicators, records and reports, and personnel training. The survey will include an audit of records and reports, observation of inventory procedures, an independent test of the inventory and a review and evaluation of the inventory differences, accidental losses, and normal operational losses as applicable to the facility to be surveyed

  11. Radiological decontamination, survey, and statistical release method for vehicles

    International Nuclear Information System (INIS)

    Goodwill, M.E.; Lively, J.W.; Morris, R.L.

    1996-06-01

    Earth-moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium millsite in Monticello, Utah (a cleanup site regulated under the Comprehensive Environmental Response, Compensation, and Liability Act). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site.
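
    The core arithmetic of such a confidence-based release scheme can be sketched as follows (a hypothetical illustration of the principle, not the Monticello sampling plan itself): the smallest random sample that would, with a chosen confidence, catch a process releasing contaminated vehicles at a given rate is

      import math

      def vehicles_to_survey(confidence: float, defect_rate: float) -> int:
          """Smallest n such that a process releasing contaminated vehicles
          at `defect_rate` would put at least one into a random sample of
          size n with probability `confidence`."""
          return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - defect_rate))

      # E.g., 95% confidence of catching a 5% contamination rate:
      print(vehicles_to_survey(0.95, 0.05))  # 59 vehicles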

  12. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and fields related to Reliability. Experts with international reputations present their newest contributions.

  13. Gravitational lensing statistics with extragalactic surveys - II. Analysis of the Jodrell Bank-VLA Astrometric Survey

    NARCIS (Netherlands)

    Helbig, P; Marlow, D; Quast, R; Wilkinson, PN; Browne, IWA; Koopmans, LVE

    We present constraints on the cosmological constant λ0 from gravitational lensing statistics of the Jodrell Bank-VLA Astrometric Survey (JVAS). Although this is the largest gravitational lens survey which has been analysed, cosmological constraints are only comparable to those from optical

  14. Opinion Polls and Statistical Surveys: What They Really Tell Us

    Indian Academy of Sciences (India)

    from the Indian Statistical Institute in 1986 and the ... any other statistical survey) is the estimation of some un- ... shares are then predicted based on a suitable mathematical model. ... have some application in opinion polls as well. In the elec ...

  15. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. The control algorithm used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. The confidence interval method is used as the basis for adaptation. A simple statistical model which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept.
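
    A toy sketch of the count-up/count-down logic described above (hypothetical signal values, thresholds and step sizes; the paper's actual test statistic and calibration differ):

      from statistics import mean

      def knock_step(amplitudes, threshold, retard=0.75, advance=0.25):
          """One control update: compare the average maximal knock-sensor
          amplitude in a window against the detection threshold."""
          if mean(amplitudes) > threshold:  # hypothesis 'knock present' accepted
              return -retard                # count down: retard ignition timing
          return advance                    # count up: creep back toward optimum

      timing = 20.0  # hypothetical spark advance, degrees before TDC
      for window in ([0.2, 0.3, 0.8], [1.4, 1.9, 1.6], [0.3, 0.2, 0.4]):
          timing += knock_step(window, threshold=1.0)
          print(f"window mean {mean(window):.2f} -> timing {timing:.2f} deg")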

  16. Frontiers in statistical quality control 11

    CERN Document Server

    Schmid, Wolfgang

    2015-01-01

    The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientific...

  17. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, confronting high-dimensional data sets has become a normal occurrence. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, where the aim is to identify an “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive amount of cross correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts
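
    A compact sketch of the monitoring scheme the abstract outlines (synthetic data; the control limit shown is the common F-approximation for a T2 statistic on k retained components, stated here as an assumption rather than the paper's prescription):

      import numpy as np
      from scipy.stats import f

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 20))   # reference data, mostly in control
      X[-1] += 10.0                    # one grossly shifted observation

      Xc = X - X.mean(axis=0)          # center, then PCA via SVD
      _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
      k = 3                            # retain k principal components
      scores = Xc @ Vt[:k].T
      lam = s[:k] ** 2 / (len(X) - 1)  # variances of the k score columns

      T2 = (scores ** 2 / lam).sum(axis=1)   # Hotelling-type statistic
      n = len(X)
      ucl = k * (n - 1) * (n + 1) / (n * (n - k)) * f.ppf(0.99, k, n - k)
      print(np.where(T2 > ucl)[0])     # indices beyond the limit (should include 199)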

  18. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  19. THE INTEGRATED SHORT-TERM STATISTICAL SURVEYS: EXPERIENCE OF NBS IN MOLDOVA

    Directory of Open Access Journals (Sweden)

    Oleg CARA

    2012-07-01

    The users’ rising need for relevant, reliable, coherent and timely data for the early diagnosis of economic vulnerability and of turning points in business cycles, especially during a financial and economic crisis, calls for a prompt answer coordinated by statistical institutions. High-quality short-term statistics are of special interest for emerging market economies, such as the Moldovan one, which are extremely vulnerable when facing economic recession. Answering the challenges of producing a coherent and adequate image of the economic activity, by using the system of indicators and definitions efficiently applied at the level of the European Union, the National Bureau of Statistics (NBS) of the Republic of Moldova has launched the development of an integrated system of short-term statistics (STS) based on advanced international experience. Thus, in 2011, NBS implemented the integrated statistical survey on STS based on consistent concepts, harmonized with the EU standards. The integration of the production processes, which were previously separated, is based on a common technical infrastructure, standardized procedures and techniques for data production. The achievement of this complex survey with a holistic approach has allowed the consolidation of statistical data quality, comparable at European level, and a significant reduction of the information burden on business units, especially those of small size. The reform of STS based on the integrated survey has been possible thanks to the consistent methodological and practical support given to NBS by the National Institute of Statistics (INS) of Romania, for which we thank our Romanian colleagues.

  1. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2001-01-01

    The book is a collection of papers presented at the 5th International Workshop on Intelligent Statistical Quality Control in Würzburg, Germany. Contributions deal with methodology and successful industrial applications. They can be grouped into four categories: Sampling Inspection, Statistical Process Control, Data Analysis and Process Capability Studies, and Experimental Design.

  2. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitative...

  3. Statistical disclosure control for microdata methods and applications in R

    CERN Document Server

    Templ, Matthias

    2017-01-01

    This book on statistical disclosure control presents the theory, applications and software implementation of the traditional approach to (micro)data anonymization, including data perturbation methods, disclosure risk, data utility, information loss and methods for simulating synthetic data. Introducing readers to the R packages sdcMicro and simPop, the book also features numerous examples and exercises with solutions, as well as case studies with real-world data, accompanied by the underlying R code to allow readers to reproduce all results. The demand for and volume of data from surveys, registers or other sources containing sensitive information on persons or enterprises have increased significantly over the last several years. At the same time, privacy protection principles and regulations have imposed restrictions on the access and use of individual data. Proper and secure microdata dissemination calls for the application of statistical disclosure control methods to the data before release. This book is in...

  4. Challenges in dental statistics: survey methodology topics

    OpenAIRE

    Pizzo, Giuseppe; Milani, Silvano; Spada, Elena; Ottolenghi, Livia

    2013-01-01

    This paper gathers some contributions concerning survey methodology in dental research, as discussed during the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona on the 28th of September 2011. The first contribution deals with the European Global Oral Health Indicators Development (EGOHID) Project, which proposed a comprehensive and standardized system of epidemiological tools (questionnaires and clinical forms) for national data...

  5. Methods and applications of statistics in engineering, quality control, and the physical sciences

    CERN Document Server

    Balakrishnan, N

    2011-01-01

    Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously-published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...

  6. Statistical process control in wine industry using control cards

    OpenAIRE

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

    This paper is based on research into the technological process of automatic filling of wine bottles in a winery in Stip, Republic of Macedonia. Statistical process control using control cards is applied. The results and recommendations for improving the process are discussed.

  7. A Survey of Introductory Statistics Courses at University Faculties of Pharmaceutical Sciences in Japan.

    Science.gov (United States)

    Matsumura, Mina; Nakayama, Takuto; Sozu, Takashi

    2016-01-01

    A survey of introductory statistics courses at Japanese medical schools was published as a report in 2014. To obtain a complete understanding of the way in which statistics is taught at the university level in Japan, it is important to extend this survey to related fields, including pharmacy, dentistry, and nursing. The current study investigates the introductory statistics courses offered by faculties of pharmaceutical sciences (six-year programs) at Japanese universities, comparing the features of these courses with those studied in the survey of medical schools. We collected relevant data from the online syllabi of statistics courses published on the websites of 71 universities. The survey items included basic course information (for example, the course names, the targeted student grades, the number of credits, and course classification), textbooks, handouts, the doctoral subject and employment status of each lecturer, and course contents. The period surveyed was July-September 2015. We found that these 71 universities offered a total of 128 statistics courses. There were 67 course names, the most common of which was "biostatistics (iryou toukeigaku)." About half of the courses were designed for first- or second-year students. Students earned fewer than two credits. There were 62 different types of textbooks. The lecturers held doctoral degrees in 18 different subjects, the most common being a doctorate in pharmacy or science. Some course content differed, reflecting the lecturers' academic specialties. The content of introductory statistics courses taught in pharmaceutical science programs also differed slightly from the equivalent content taught in medical schools.

  8. Control cards as a statistical quality control resource

    Directory of Open Access Journals (Sweden)

    Aleksandar Živan Drenovac

    2013-02-01

    This paper shows that the application of statistical methods can significantly contribute to increasing the quality of products and services, as well as an institution's rating. The determination of optimal, anticipatory and limit values is based on the statistical analysis of samples. Control cards are a very reliable instrument, simple to use and efficient for process control, by which a process is maintained within set limits. Thus, control cards can be applied in the quality control of weapons and military equipment production and the maintenance of technical systems, as well as in setting standards and increasing the quality level of many other activities.

  9. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    Science.gov (United States)

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  10. Multivariate Statistical Process Control Charts: An Overview

    OpenAIRE

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...

  11. Gravitational lensing statistics with extragalactic surveys; 2, Analysis of the Jodrell Bank-VLA Astrometric Survey

    NARCIS (Netherlands)

    Helbig, P.; Marlow, D. R.; Quast, R.; Wilkinson, P. N.; Browne, I. W. A.; Koopmans, L. V. E.

    1999-01-01

    Published in Astron. Astrophys. Suppl. Ser. 136 (1999), no. 2, pp. 297-305. We present constraints on the cosmological constant λ0 from gravitational lensing statistics of the Jodrell Bank-VLA Astrometric Survey (JVAS). Although this

  12. NGS Survey Control Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NGS Survey Control Map provides a map of the US which allows you to find and display geodetic survey control points stored in the database of the National...

  13. Predictors of certification in infection prevention and control among infection preventionists: APIC MegaSurvey findings.

    Science.gov (United States)

    Kalp, Ericka L; Harris, Jeanette J; Zawistowski, Grace

    2018-06-06

    The 2015 APIC MegaSurvey was completed by 4,078 members to assess infection prevention practices. This study's purpose was to examine MegaSurvey results to relate infection preventionist (IP) certification status with demographic characteristics, organizational structure, compensation benefits, and practice and competency factors. Descriptive statistics were used to examine population characteristics and certification status. Bivariate logistic regression was performed to evaluate relationships between independent variables and certification status. Variables demonstrating statistical significance (P...

  14. 75 FR 21231 - Proposed Information Collection; Comment Request; Marine Recreational Fisheries Statistics Survey

    Science.gov (United States)

    2010-04-23

    ... Collection; Comment Request; Marine Recreational Fisheries Statistics Survey. AGENCY: National Oceanic and... SUPPLEMENTARY INFORMATION: Marine recreational anglers are surveyed for catch and effort data, fish biology data, and angler socioeconomic...

  15. Evaluation of Theoretical and Empirical Characteristics of the Communication, Language, and Statistics Survey (CLASS)

    Science.gov (United States)

    Wagler, Amy E.; Lesser, Lawrence M.

    2018-01-01

    The interaction between language and the learning of statistical concepts has been receiving increased attention. The Communication, Language, And Statistics Survey (CLASS) was developed in response to the need to focus on dynamics of language in light of the culturally and linguistically diverse environments of introductory statistics classrooms.…

  16. Memory-type control charts in statistical process control

    NARCIS (Netherlands)

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark the extent of common cause variation. Once

  17. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
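
    A minimal sketch of the kind of chart SPC builds (illustrative numbers, not data from the article): an individuals chart whose limits are estimated from the average moving range, so that points beyond them signal special cause variation:

      import statistics

      def individuals_chart(x):
          """Shewhart individuals (X) chart limits from the average moving
          range, the usual estimate of common-cause variation for single
          values (d2 = 1.128 for moving ranges of size 2)."""
          mr = [abs(a - b) for a, b in zip(x[1:], x)]
          sigma_hat = statistics.mean(mr) / 1.128
          center = statistics.mean(x)
          return center - 3 * sigma_hat, center, center + 3 * sigma_hat

      data = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 5.6]  # last point: special cause?
      lcl, center, ucl = individuals_chart(data)
      flagged = [i for i, v in enumerate(data) if not lcl <= v <= ucl]
      print(f"LCL={lcl:.2f} CL={center:.2f} UCL={ucl:.2f} flagged={flagged}")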

  18. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Science.gov (United States)

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grades 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
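
    For intuition, a P chart calculation of the sort the authors applied can be sketched as follows (hypothetical prevalence rates and a constant yearly denominator; real P charts use each year's own n):

      import math

      def p_chart_limits(p_bar: float, n: int):
          """3-sigma limits for a proportion chart, used to judge whether
          yearly prevalence rates vary by chance only (common cause)."""
          se = math.sqrt(p_bar * (1 - p_bar) / n)
          return max(0.0, p_bar - 3 * se), p_bar + 3 * se

      # Hypothetical yearly nosocomial prevalence in one hospital, n patients/year.
      rates, n = [0.12, 0.10, 0.14, 0.11, 0.22, 0.09], 400
      p_bar = sum(rates) / len(rates)
      lcl, ucl = p_chart_limits(p_bar, n)
      special = [r for r in rates if not lcl <= r <= ucl]
      print(f"p-bar={p_bar:.3f} limits=({lcl:.3f}, {ucl:.3f}) special-cause={special}")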

  19. Challenges in dental statistics: survey methodology topics

    Directory of Open Access Journals (Sweden)

    Giuseppe Pizzo

    2013-12-01

    This paper gathers some contributions concerning survey methodology in dental research, as discussed during the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona on the 28th of September 2011. The first contribution deals with the European Global Oral Health Indicators Development (EGOHID) Project, which proposed a comprehensive and standardized system of epidemiological tools (questionnaires and clinical forms) for national data collection on oral health in Europe. The second contribution regards the design and conduct of trials to evaluate the clinical efficacy and safety of toothbrushes and mouthrinses. Finally, a flexible and effective tool used to trace dental age reference charts tailored to Italian children is presented.

  20. Survey mode matters: adults' self-reported statistical confidence, ability to obtain health information, and perceptions of patient-health-care provider communication.

    Science.gov (United States)

    Wallace, Lorraine S; Chisolm, Deena J; Abdel-Rasoul, Mahmoud; DeVoe, Jennifer E

    2013-08-01

    This study examined adults' self-reported understanding and formatting preferences for medical statistics, confidence in self-care and ability to obtain health advice or information, and perceptions of patient-health-care provider communication, measured through dual survey modes (random digit dial and mail). Even after controlling for sociodemographic characteristics, significant differences in adults' responses emerged as a function of survey mode. While the analyses do not allow us to pinpoint the underlying causes of the differences observed, they do suggest that mode of administration should be carefully adjusted for and considered.

  1. Survey of Native English Speakers and Spanish-Speaking English Language Learners in Tertiary Introductory Statistics

    Science.gov (United States)

    Lesser, Lawrence M.; Wagler, Amy E.; Esquinca, Alberto; Valenzuela, M. Guadalupe

    2013-01-01

    The framework of linguistic register and case study research on Spanish-speaking English language learners (ELLs) learning statistics informed the construction of a quantitative instrument, the Communication, Language, And Statistics Survey (CLASS). CLASS aims to assess whether ELLs and non-ELLs approach the learning of statistics differently with…

  2. STATISTICAL ASSESSMENT OF CAREERS OF DOCTORATE HOLDERS SURVEY BY CORRESPONDENCE ANALYSIS: THE CASE OF TURKEY

    OpenAIRE

    Zerrin AŞAN GREENACRE

    2018-01-01

    Nowadays, doctorate holders are not only academics and researchers; they also play roles across a wide range of working areas within society, and these areas are increasing in many directions. The aim of this study is to give a statistical evaluation of the Careers of Doctorate Holders survey using correspondence analysis with supplementary variables. In this paper we focus on the careers of doctorate holders in Turkey. A survey was conducted by the Turkish Statistical Institute on this s...

  3. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  4. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Directory of Open Access Journals (Sweden)

    Fabiano Rodrigues Soriano

    Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring and analyzing the causes of variation in quality characteristics and/or in the parameters used for process control and improvement. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as lack of top management commitment and little understanding of its potential benefits, and to technical factors, such as lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top management commitment (support), SPC training and application, as well as the relationships between these factors and the benefits associated with the implementation of the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish causal relations. A cross-sectional survey was used to collect information from a sample of Brazilian auto-parts companies, selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey; 93 agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs. In turn, training affects the way companies apply the techniques, and this is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected and are represented by the relation between top management support and training. The technical aspects observed through SPC

  5. Using Statistical Process Control to Enhance Student Progression

    Science.gov (United States)

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  6. Trends in violent crime: a comparison between police statistics and victimization surveys

    NARCIS (Netherlands)

    Wittebrood, Karin; Junger, Marianne

    2002-01-01

    Usually, two measures are used to describe trends in violent crime: police statistics and victimization surveys. Both are available in the Netherlands. In this contribution, we will first provide a description of the trends in violent crime. It appears that both types of statistics reflect a different

  7. Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees

    Science.gov (United States)

    Harraway, John A.; Barker, Richard J.

    2005-01-01

    A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…

  8. Validation of survey information on smoking and alcohol consumption against import statistics, Greenland 1993-2010.

    Science.gov (United States)

    Bjerregaard, Peter; Becker, Ulrik

    2013-01-01

    Questionnaires are widely used to obtain information on health-related behaviour, and they are more often than not the only method that can be used to assess the distribution of behaviour in subgroups of the population. No validation studies of reported consumption of tobacco or alcohol have been published from circumpolar indigenous communities. The purpose of the study is to compare information on the consumption of tobacco and alcohol obtained from 3 population surveys in Greenland with import statistics. Estimates of consumption of cigarettes and alcohol using several different survey instruments in cross-sectional population studies from 1993-1994, 1999-2001 and 2005-2010 were compared with import statistics from the same years. For cigarettes, survey results accounted for virtually the total import. Alcohol consumption was significantly under-reported with reporting completeness ranging from 40% to 51% for different estimates of habitual weekly consumption in the 3 study periods. Including an estimate of binge drinking increased the estimated total consumption to 78% of the import. Compared with import statistics, questionnaire-based population surveys capture the consumption of cigarettes well in Greenland. Consumption of alcohol is under-reported, but asking about binge episodes in addition to the usual intake considerably increased the reported intake in this population and made it more in agreement with import statistics. It is unknown to what extent these findings at the population level can be inferred to population subgroups.

  9. Survey of occupant behaviour and control of indoor environment in Danish dwellings

    DEFF Research Database (Denmark)

    Andersen, Rune Vinther; Toftum, Jørn; Andersen, Klaus Kaae

    2009-01-01

    Repeated surveys of occupant control of the indoor environment were carried out in Danish dwellings from September to October 2006 and again from February to March 2007. The summer survey comprised 933 respondents and the winter survey 636 respondents. The surveys were carried out by sending out... ...separately by means of multiple logistic regression in order to quantify factors influencing occupants’ behaviour. The window opening behaviour was strongly related to the outdoor temperature. The perception of the environment and factors concerning the dwelling also impacted the window opening behaviour. The proportion of dwellings with the heating turned on was strongly related to the outdoor temperature and the presence of a wood burning stove. The solar radiation, dwelling ownership conditions and the perception of the indoor environment also affected the use of heating. The results of the statistical...
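
    The modelling step described above can be sketched as follows (entirely synthetic data and coefficients; scikit-learn stands in for whatever software the authors actually used):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      temp = rng.uniform(-5, 25, 500)              # outdoor temperature, deg C
      solar = rng.uniform(0, 800, 500)             # solar radiation, W/m2
      logit = -3.0 + 0.15 * temp + 0.002 * solar   # assumed 'true' model
      window_open = rng.random(500) < 1 / (1 + np.exp(-logit))

      # Multiple logistic regression of the binary behaviour on both factors.
      X = np.column_stack([temp, solar])
      model = LogisticRegression().fit(X, window_open)
      print("coefficients:", model.coef_, "intercept:", model.intercept_)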

  10. Minimum detectable activities of contamination control survey equipment

    International Nuclear Information System (INIS)

    Goles, R.W.; Baumann, B.L.; Johnson, M.L.

    1991-08-01

    The Instrumentation & External Dosimetry (I&ED) Section of the Health Physics Department at the Pacific Northwest Laboratory (PNL) has performed a series of tests to determine the ability of portable survey instruments used at Hanford to detect radioactive contamination at levels required by DOE 5480.11. This semi-empirical study combines instrumental, statistical, and human factors as necessary to derive operational detection limits. These threshold detection values have been compared to existing contamination control requirements, and detection deficiencies have been identified when present. Portable survey instruments used on the Hanford Site identify the presence of radioactive surface contamination based on the detection of α-, β-, γ-, and/or x-radiation. However, except in some unique circumstances, most contamination monitors in use at Hanford are configured to detect either α-radiation alone or β- and γ-radiation together. Testing was therefore conducted on only these two categories of radiation detection devices. Nevertheless, many of the results obtained are generally applicable to all survey instruments, allowing performance evaluations to be extended to monitoring devices which are exclusively γ- and/or x-ray-sensitive. 6 figs., 2 tabs

  11. A computerized diagnostic system for nuclear plant control rooms based on statistical quality control

    International Nuclear Information System (INIS)

    Heising, C.D.; Grenzebach, W.S.

    1990-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant located in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, which were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators

  12. Using Person Fit Statistics to Detect Outliers in Survey Research

    Directory of Open Access Journals (Sweden)

    John M. Felt

    2017-05-01

    Context: When working with health-related questionnaires, outlier detection is important. However, traditional methods of outlier detection (e.g., boxplots) can miss participants with “atypical” responses to the questions who otherwise have similar total (subscale) scores. In addition to detecting outliers, it can be of clinical importance to determine the reason for the outlier status or “atypical” response. Objective: The aim of the current study was to illustrate how to derive person fit statistics for outlier detection through a statistical method examining person fit with a health-based questionnaire. Design and Participants: Patients treated for Cushing's syndrome (n = 394) were recruited from the Cushing's Support and Research Foundation's (CSRF) listserv and Facebook page. Main Outcome Measure: Patients were directed to an online survey containing the CushingQoL (English version). A two-dimensional graded response model was estimated, and person fit statistics were generated using the Zh statistic. Results: Conventional outlier detection methods revealed no outliers reflecting extreme scores on the subscales of the CushingQoL. However, person fit statistics identified 18 patients with “atypical” response patterns, which would otherwise have been missed (Zh > |±2.00|). Conclusion: While the conventional methods of outlier detection indicated no outliers, person fit statistics identified several patients with “atypical” response patterns who otherwise appeared average. Person fit statistics allow researchers to delve further into the underlying problems experienced by these “atypical” patients treated for Cushing's syndrome. Annotated code is provided to aid other researchers in using this method.
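
    The flavour of the Zh statistic can be conveyed with the standardized log-likelihood person-fit index for the simplest case, a dichotomous Rasch model with known item difficulties (the study itself used a two-dimensional graded response model, so this is only an illustrative analogue):

      import math

      def zh_person_fit(responses, theta, difficulties):
          """Standardized log-likelihood person fit (lz/Zh-type index) for a
          dichotomous Rasch model; large negative values flag 'atypical'
          response patterns relative to the model."""
          l0 = ev = var = 0.0
          for u, b in zip(responses, difficulties):
              p = 1.0 / (1.0 + math.exp(-(theta - b)))
              l0 += u * math.log(p) + (1 - u) * math.log(1.0 - p)
              ev += p * math.log(p) + (1.0 - p) * math.log(1.0 - p)
              var += p * (1.0 - p) * math.log(p / (1.0 - p)) ** 2
          return (l0 - ev) / math.sqrt(var)

      # A respondent who fails the easy items but solves the hard ones:
      difficulties = [-2.0, -1.0, 0.0, 1.0, 2.0]
      print(round(zh_person_fit([0, 0, 1, 1, 1], 0.0, difficulties), 2))  # about -4.5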

  13. Engaging Students in Survey Research Projects across Research Methods and Statistics Courses

    Science.gov (United States)

    Lovekamp, William E.; Soboroff, Shane D.; Gillespie, Michael D.

    2017-01-01

    One innovative way to help students make sense of survey research has been to create a multifaceted, collaborative assignment that promotes critical thinking, comparative analysis, self-reflection, and statistical literacy. We use a short questionnaire adapted from the Higher Education Research Institute's Cooperative Institutional Research…

  14. Parity Specific Birth Rates for West Germany: An Attempt to Combine Survey Data and Vital Statistics

    OpenAIRE

    Kreyenfeld, Michaela

    2014-01-01

    In this paper, we combine vital statistics and survey data to obtain parity specific birth rates for West Germany. Since vital statistics do not provide birth parity information, one is confined to using estimates. The robustness of these estimates is an issue, which is unfortunately only rarely addressed when fertility indicators for (West) Germany are reported. In order to check how reliable our results are, we estimate confidence intervals and compare them to results from survey data and e...

  15. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  16. MANAGERIAL DECISION IN INNOVATIVE EDUCATION SYSTEMS STATISTICAL SURVEY BASED ON SAMPLE THEORY

    Directory of Open Access Journals (Sweden)

    Gheorghe SĂVOIU

    2012-12-01

    Before formulating the statistical hypotheses and the econometric testing itself, a breakdown of some of the technical issues is required. These relate to managerial decision in innovative educational systems, to the educational managerial phenomenon tested through statistical and mathematical methods, and to the significant differences in perceiving the current and desirable qualities, knowledge, experience, behaviour and health, obtained through a questionnaire applied to a stratified population in the educational environment, comprising staff with either educational activities or with simultaneously managerial and educational activities. The details of the research, which is based on survey theory and turns the questionnaires and the statistical data processed from them into working tools, are summarized below.

  17. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving...

  18. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Science.gov (United States)

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to improved security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, as characterized by pretreatment quality control results; it is therefore necessary to bring portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast drifts, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to
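
    The capability study mentioned above compares the spread of the process with the tolerance limits. A minimal Python sketch of the usual capability indices, assuming hypothetical dose-deviation data and an illustrative ±5% tolerance (the tolerances actually used at the centre are not given in the record):

```python
import numpy as np

def capability(samples, lsl, usl):
    """Process capability indices Cp and Cpk from measurement data."""
    samples = np.asarray(samples, dtype=float)
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                  # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # accounts for centering
    return cp, cpk

# Hypothetical pretreatment dose deviations (%) against a +/-5% tolerance
rng = np.random.default_rng(1)
deviations = rng.normal(0.5, 1.2, size=100)
print(capability(deviations, lsl=-5.0, usl=5.0))
```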

  19. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose The first purpose of this study was to illustrate the contribution of statistical process control to improved security in intensity modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, as characterized by pretreatment quality control results; it is therefore necessary to bring portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. Patients and methods At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Results Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast drifts, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point by the E.P.I.D. and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion The study allowed to

  20. Statistical process control for serially correlated data

    NARCIS (Netherlands)

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice

  1. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with special emphasis on general principles and problems common to all types of learning methods. Different aspects of controlling these methods in a physically adequate way are discussed. All principles and guidelines are exercised on examples of statistical learning methods in high energy physics and astrophysics. In addition, these examples show that statistical learning methods very often lead to a remarkable performance gain compared to competing classical algorithms.

  2. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

    The basics of statistical learning are reviewed with special emphasis on general principles and problems common to all types of learning methods. Different aspects of controlling these methods in a physically adequate way are discussed. All principles and guidelines are exercised on examples of statistical learning methods in high energy physics and astrophysics. In addition, these examples show that statistical learning methods very often lead to a remarkable performance gain compared to competing classical algorithms.

  3. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Science.gov (United States)

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  4. Validation of survey information on smoking and alcohol consumption against import statistics, Greenland 1993–2010

    Directory of Open Access Journals (Sweden)

    Peter Bjerregaard

    2013-03-01

    Full Text Available Background. Questionnaires are widely used to obtain information on health-related behaviour, and they are more often than not the only method that can be used to assess the distribution of behaviour in subgroups of the population. No validation studies of reported consumption of tobacco or alcohol have been published from circumpolar indigenous communities. Objective. The purpose of the study is to compare information on the consumption of tobacco and alcohol obtained from 3 population surveys in Greenland with import statistics. Design. Estimates of consumption of cigarettes and alcohol using several different survey instruments in cross-sectional population studies from 1993–1994, 1999–2001 and 2005–2010 were compared with import statistics from the same years. Results. For cigarettes, survey results accounted for virtually the total import. Alcohol consumption was significantly under-reported with reporting completeness ranging from 40% to 51% for different estimates of habitual weekly consumption in the 3 study periods. Including an estimate of binge drinking increased the estimated total consumption to 78% of the import. Conclusion. Compared with import statistics, questionnaire-based population surveys capture the consumption of cigarettes well in Greenland. Consumption of alcohol is under-reported, but asking about binge episodes in addition to the usual intake considerably increased the reported intake in this population and made it more in agreement with import statistics. It is unknown to what extent these findings at the population level can be inferred to population subgroups.
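
    The reporting-completeness figures quoted above are simply the ratio of the survey-based consumption estimate to the import-based estimate. A tiny worked example with illustrative numbers (not the study's data):

```python
# Reporting completeness = survey-based consumption / import-based consumption.
# The figures below are illustrative, not the Greenland study's data.
survey_alcohol = 6.1   # litres pure alcohol per adult per year (reported)
import_alcohol = 12.2  # litres pure alcohol per adult per year (from imports)

completeness = survey_alcohol / import_alcohol
print(f"reporting completeness: {completeness:.0%}")  # -> 50%
```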

  5. Quantum control theory and applications: A survey

    OpenAIRE

    Dong, Daoyi; Petersen, Ian R

    2009-01-01

    This paper presents a survey on quantum control theory and applications from a control systems perspective. Some of the basic concepts and main developments (including open-loop control and closed-loop control) in quantum control theory are reviewed. In the area of open-loop quantum control, the paper surveys the notion of controllability for quantum systems and presents several control design strategies including optimal control, Lyapunov-based methodologies, variable structure control and q...

  6. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  7. Robust Control Methods for On-Line Statistical Learning

    Directory of Open Access Journals (Sweden)

    Capobianco Enrico

    2001-01-01

    Full Text Available Ensuring that data processing in an experiment is not affected by the presence of outliers is a relevant issue for statistical control and learning studies. Learning schemes should thus be tested for their capacity to handle outliers in the observed training set, so as to achieve reliable estimates with respect to the crucial bias and variance aspects. We describe possible ways of endowing neural networks with statistically robust properties by defining feasible error criteria. It is convenient to cast neural nets in state-space representations and apply both Kalman filter and stochastic approximation procedures in order to suggest statistically robustified solutions for on-line learning.
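
    The record describes robust error criteria only in general terms. One common way to endow an on-line (stochastic approximation) learner with robustness to outliers is to replace the squared-error gradient with a bounded one, e.g. the Huber influence function. The sketch below is an illustration under that assumption, not the paper's Kalman-filter formulation.

```python
import numpy as np

def huber_grad(residual, delta=1.0):
    """Influence function of the Huber loss: linear near zero, clipped in
    the tails, so outliers get bounded influence on each update."""
    return np.clip(residual, -delta, delta)

def robust_online_fit(X, y, lr=0.05, delta=1.0, epochs=20):
    """Stochastic-approximation estimate of linear weights with a robust
    (Huber) error criterion instead of plain least squares."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            r = yi - xi @ w
            w += lr * huber_grad(r, delta) * xi
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.5, -0.7]) + rng.normal(scale=0.1, size=200)
y[::25] += 20.0                     # inject gross outliers
print(robust_online_fit(X, y))      # stays close to [1.5, -0.7]
```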

  8. Radiographic rejection index using statistical process control

    International Nuclear Information System (INIS)

    Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.

    2015-01-01

    The Repeat Analysis Index (IRR) is one of the items contained in the Quality Control Program dictated by the Brazilian law of radiological protection and should be assessed frequently, at least every six months. In order to extract more and better information from the IRR, this study presents statistical quality control applied to the reject rate through Statistical Process Control (Control Chart for Attributes p - GC) and the Pareto Chart (GP). Data collection was performed for 9 months, with daily collection during the last four months. Control limits (LC) were established and the Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analyzed. Relevant information, such as service orders for the X-ray equipment and processors, was cross-referenced to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The GC demonstrated the ability to predict equipment failures, and the GP showed clearly which causes are recurrent in the IRR. (authors) [pt
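
    A control chart for attributes applied to reject rates, as described above, is typically a p chart. A minimal Python sketch with hypothetical daily counts of rejected radiographs:

```python
import numpy as np

def p_chart(rejects, totals):
    """Center line and 3-sigma limits for a p chart (attributes chart)."""
    rejects, totals = np.asarray(rejects), np.asarray(totals)
    pbar = rejects.sum() / totals.sum()           # overall reject rate
    sigma = np.sqrt(pbar * (1 - pbar) / totals)   # per-day standard error
    lcl = np.clip(pbar - 3 * sigma, 0.0, None)
    ucl = np.clip(pbar + 3 * sigma, None, 1.0)
    return pbar, lcl, ucl

# Hypothetical daily counts: radiographs rejected out of radiographs taken
rejects = np.array([  9,  12,   7,  15,  30,  10,   8])
totals  = np.array([120, 130, 110, 140, 135, 125, 115])
pbar, lcl, ucl = p_chart(rejects, totals)
rates = rejects / totals
print(pbar, np.where((rates > ucl) | (rates < lcl))[0])  # day 5 (index 4) flagged
```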

  9. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  10. Survey of licensee control room habitability practices

    International Nuclear Information System (INIS)

    Boland, J.F.; Brookshire, R.L.; Danielson, W.F.; Driscoll, J.W.; Graham, E.D.; McConnell, R.J.; Thompson, V.N.

    1985-04-01

    This document presents the results of a survey of Licensee control-room-habitability practices. The survey is part of a comprehensive program plan instituted in August 1983 by the NRC to respond to ongoing questions from the Advisory Committee on Reactor Safeguards (ACRS). The emphasis of this survey was to determine by field review the control-room habitability practices at three different plants, one of which is still under construction and scheduled to receive an operating license in 1986. The other two plants are currently operating, having received operating licenses in the mid-1970's and early 1980's. The major finding of this survey is that despite the fact that the latest control-room-habitability systems have become larger and more complex than earlier systems surveyed, the latest systems do not appear to be functionally superior. The major recommendation of this report is to consolidate into a single NRC document, based upon a comprehensive systems engineering approach, the pertinent criteria for control-room-habitability design

  11. Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data

    International Nuclear Information System (INIS)

    Kacprzak, T.; Kirk, D.; Friedrich, O.; Amara, A.; Refregier, A.

    2016-01-01

    Shear peak statistics has gained a lot of attention recently as a practical alternative to two-point statistics for constraining cosmological parameters. We perform a shear peak statistics analysis of the Dark Energy Survey (DES) Science Verification (SV) data, using weak gravitational lensing measurements from a 139 deg² field. We measure the abundance of peaks identified in aperture mass maps, as a function of their signal-to-noise ratio, in the signal-to-noise range 0 < S/N < 4; peaks with S/N > 4 would require significant corrections, which is why we do not include them in our analysis. We compare our results to the cosmological constraints from the two-point analysis on the SV field and find them to be in good agreement in both the central value and its uncertainty. Lastly, we discuss prospects for future peak statistics analysis with upcoming DES data.

  12. Statistical Control Charts: Performances of Short Term Stock Trading in Croatia

    Directory of Open Access Journals (Sweden)

    Dumičić Ksenija

    2015-03-01

    Full Text Available Background: The stock exchange, as a regulated financial market, reflects the economic development level of modern economies. The stock market indicates the mood of investors in the development of a country and is an important ingredient for growth. Objectives: This paper aims to introduce an additional statistical tool to support the decision-making process in stock trading, and it investigates the use of statistical process control (SPC) methods in the stock trading process. Methods/Approach: The individual (I), exponentially weighted moving average (EWMA) and cumulative sum (CUSUM) control charts were used to generate trading signals. The opening and average prices of CROBEX10 index stocks on the Zagreb Stock Exchange were used in the analysis, and the capabilities of statistical control charts for short-run stock trading were analysed. Results: The statistical control chart analysis produced too many signals to buy or sell stocks, most of which are considered false alarms, so the statistical control charts proved not to be very useful in stock trading or in portfolio analysis. Conclusions: The presence of non-normality and autocorrelation has a great impact on the performance of statistical control charts. It is assumed that if these two problems were solved, the use of statistical control charts in portfolio analysis could be greatly improved.
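
    Of the three chart types used in the study, the EWMA chart is the least standard to implement. A short Python sketch of the EWMA statistic and its time-varying control limits, applied to a simulated price series; the parameters λ and L are illustrative choices, not the study's, and the in-sample mean and SD are used only for simplicity.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0):
    """EWMA statistic and time-varying control limits."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = np.empty_like(x)
    z[0] = lam * x[0] + (1 - lam) * mu        # start the EWMA at the mean
    for t in range(1, len(x)):
        z[t] = lam * x[t] + (1 - lam) * z[t - 1]
    t = np.arange(1, len(x) + 1)
    half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    return z, mu - half, mu + half

rng = np.random.default_rng(3)
prices = 100 + np.cumsum(rng.normal(0, 1, 100))  # toy price path
z, lcl, ucl = ewma_chart(np.diff(prices))        # chart the daily price changes
print(np.where((z > ucl) | (z < lcl))[0])        # candidate trade signals
```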

  13. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  14. Application of multivariate statistical methods in analyzing expectation surveys in Central Bank of Nigeria

    OpenAIRE

    Raymond, Ogbuka Obinna

    2017-01-01

    In analyzing survey data, most researchers and analysts make use of statistical methods with straightforward statistical approaches. More common is the use of one-way, two-way or multi-way tables, and graphical displays such as bar charts, line charts, etc. A brief overview of these approaches and a good discussion of aspects needing attention during the data analysis process can be found in Wilson & Stern (2001). In most cases, however, analysis procedures that go beyond simp...

  15. Use of statistical process control in the production of blood components

    DEFF Research Database (Denmark)

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced when a change was made from a relatively large, inexperienced, occasional component manufacturing staff to an experienced regular manufacturing staff with four technologists. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood components. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional...

  16. Matched case-control studies: a review of reported statistical methodology

    Directory of Open Access Journals (Sweden)

    Niven DJ

    2012-04-01

    Full Text Available Daniel J Niven, Luc R Berthiaume, Gordon H Fick, Kevin B Laupland (Department of Critical Care Medicine, Peter Lougheed Centre, Calgary, and Department of Community Health Sciences, University of Calgary, Calgary, Alberta, Canada). Background: Case-control studies are a common and efficient means of studying rare diseases or illnesses with long latency periods. Matching of cases and controls is frequently employed to control the effects of known potential confounding variables. The analysis of matched data requires specific statistical methods. Methods: The objective of this study was to determine the proportion of published, peer-reviewed matched case-control studies that used statistical methods appropriate for matched data. Using a comprehensive set of search criteria we identified 37 matched case-control studies for detailed analysis. Results: Among these 37 articles, only 16 studies were analyzed with proper statistical techniques (43%). Studies that were properly analyzed were more likely to have included case patients with cancer and cardiovascular disease compared to those that did not use proper statistics (10/16 or 63%, versus 5/21 or 24%, P = 0.02). They were also more likely to have matched multiple controls for each case (14/16 or 88%, versus 13/21 or 62%, P = 0.08). In addition, studies with properly analyzed data were more likely to have been published in a journal with an impact factor listed in the top 100 according to the Journal Citation Reports index (12/16 or 69%, versus 1/21 or 5%, P ≤ 0.0001). Conclusion: The findings of this study raise concern that the majority of matched case-control studies report results that are derived from improper statistical analyses. This may lead to errors in estimating the relationship between a disease and exposure, as well as the incorrect adaptation of emerging medical literature. Keywords: case-control, matched, dependent data, statistics
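
    For 1:1 matched data of the kind discussed above, the simplest appropriate analysis is McNemar's test on the discordant pairs. A minimal Python sketch with illustrative counts (the study itself reviewed methodology; these numbers are not from it):

```python
from scipy.stats import binomtest

# Matched-pair (1 case : 1 control) data: only discordant pairs are
# informative. Counts below are illustrative.
b = 25  # pairs where only the case was exposed
c = 10  # pairs where only the control was exposed

odds_ratio = b / c                          # conditional (matched) OR estimate
p_value = binomtest(b, b + c, 0.5).pvalue   # exact McNemar test
print(odds_ratio, p_value)
```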

  17. External quality-control surveys of peptide hormone radioimmunoassays in the Federal Republic of Germany. The present status

    International Nuclear Information System (INIS)

    Breuer, H.; Jungbluth, D.; Roehle, G.; Marschner, I.; Scriba, P.C.; Wood, W.G.

    1978-01-01

    Two types of quality-control survey (QCS) of hormone assays are performed in the Federal Republic of Germany. In the one survey, the participating laboratories are requested to determine seven or eight different hormones in two lyophilized sera that are distributed several times a year. Because of the lack of reference methods for peptide hormones, the statistical evaluation of the results indicates only whether they are ''correct'' or subject to systematic or nonsystematic errors with respect to the findings of the other participants. In the other survey, the participating laboratories are requested to assay only one given hormone in some 20 deep-frozen sera (including standards in hormone-free sera for derivation of a standard curve) that are distributed at relatively long intervals. The statistical analysis of the data derived from these QCSs allows - together with the methodological inquiry form - detection of probable causes for discrepancies in the results. (author)

  18. Quality assurance and statistical control

    DEFF Research Database (Denmark)

    Heydorn, K.

    1991-01-01

    In scientific research laboratories it is rarely possible to use quality assurance schemes developed for large-scale analysis. Instead, methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control. Analysis of precision serves to detect analytical errors by comparing the a priori precision of the analytical results with the actual variability observed among replicates or duplicates. The method relies on the chi-square distribution to detect excess variability and is quite sensitive even for 5-10 results. Interference control serves to detect analytical bias by comparing results obtained by two different analytical methods, each relying on a different detection principle and therefore exhibiting different influence from matrix elements; only 5-10 sets of results are required to establish whether a regression line passes

  19. Effective control of complex turbulent dynamical systems through statistical functionals.

    Science.gov (United States)

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.

  20. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  1. A STATISTICAL SURVEY OF DIOXIN-LIKE COMPOUNDS IN U.S. BEEF: A PROGRESS REPORT

    Science.gov (United States)

    The USEPA and the USDA completed the first statistically designed survey of the occurrence and concentration of dibenzo-p-dioxins (CDDs), dibenzofurans (CDFs), and coplanar polychlorinated biphenyls (PCBs) in the fat of beef animals raised for human consumption in the United Stat...

  2. Summary Health Statistics for U.S. Adults: National Health Interview Survey, 2009. Data from the National Health Interview Survey. Vital and Health Statistics. Series 10, Number 249. DHHS Publication No. (PHS) 2011-1577

    Science.gov (United States)

    Pleis, J. R.; Ward, B. W.; Lucas, J. W.

    2010-01-01

    Objectives: This report presents health statistics from the 2009 National Health Interview Survey (NHIS) for the civilian noninstitutionalized adult population, classified by sex, age, race and ethnicity, education, family income, poverty status, health insurance coverage, marital status, and place and region of residence. Estimates are presented…

  3. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  4. Turkish Version of the Survey of Attitudes toward Statistics: Factorial Structure Invariance by Gender

    Science.gov (United States)

    Sarikaya, Esma Emmioglu; Ok, Ahmet; Aydin, Yesim Capa; Schau, Candace

    2018-01-01

    This study examines factorial structure and the gender invariance of the Turkish version of the Survey of Attitudes toward Statistics (SATS-36). The SATS-36 has 36 items measuring six components: affect, cognitive competence, value, difficulty, effort, and interest. Data were collected from 347 university students. Results showed that the Turkish…

  5. A Statistical Project Control Tool for Engineering Managers

    Science.gov (United States)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs plotting the results of 3 successful projects and 3 failed projects using the SPCT method are reviewed, with success and failure being defined by the owner.

  6. Statistical process control charts for monitoring military injuries.

    Science.gov (United States)

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
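
    A Shewhart u chart of the kind described, with limits three SDs around a historical average rate, can be sketched as follows. The installation data below are simulated, not from the Defense Medical Surveillance System, and the 28-quarter baseline mirrors the abstract.

```python
import numpy as np

def u_chart_limits(counts, exposures, baseline=28):
    """Shewhart u chart for rates: 3-SD limits around a historical average
    computed from the first `baseline` quarters."""
    counts = np.asarray(counts, dtype=float)
    exposures = np.asarray(exposures, dtype=float)
    ubar = counts[:baseline].sum() / exposures[:baseline].sum()
    sigma = np.sqrt(ubar / exposures)   # Poisson-based SD for each quarter
    return ubar, ubar - 3 * sigma, ubar + 3 * sigma

# Hypothetical quarterly injuries and person-years for one installation
rng = np.random.default_rng(4)
person_years = rng.uniform(2000, 2500, size=36)
injuries = rng.poisson(1.35 * person_years)   # ~1350 per 1000 person-years
ubar, lcl, ucl = u_chart_limits(injuries, person_years)
rates = injuries / person_years
print(np.where((rates > ucl) | (rates < lcl))[0])  # quarters out of control
```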

  7. The Business Of Urban Animals Survey: the facts and statistics on companion animals in Canada.

    Science.gov (United States)

    Perrin, Terri

    2009-01-01

    At the first Banff Summit for Urban Animal Strategies (BSUAS) in 2006, delegates clearly indicated that a lack of reliable Canadian statistics hampers municipal leaders and legislators in their efforts to develop urban animal strategies that create and sustain a healthy community for pets and people. To gain a better understanding of the situation, BSUAS municipal delegates and other industry stakeholders partnered with Ipsos Reid, one of the world's leading polling firms, to conduct a national survey on the "Business of Urban Animals." The results of the survey, summarized in this article, were presented at the BSUAS meeting in October 2008. In addition, each participating community will receive a comprehensive written analysis, as well as a customized report. The online survey was conducted from September 22 to October 1, 2008. There were 7208 participants, including 3973 pet and 3235 non-pet owners from Ipsos Reid's proprietary Canadian online panel. The national results were weighted to reflect the true population distribution across Canada and the panel was balanced on all major demographics to mirror Statistics Canada census information. The margin of error for the national results is +/- 1.15%.

  8. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  9. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the orientation of the author to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation when the need arises.

  10. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR-type fuel plates by applying statistical techniques of process control. The work was divided into four steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and application of statistical tools and standard specifications to perform a comparative study of these processes. (author)

  11. Adaptive Critic Nonlinear Robust Control: A Survey.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    Adaptive dynamic programming (ADP) and reinforcement learning are quite relevant to each other when performing intelligent optimization. They are both regarded as promising methods involving important components of evaluation and improvement, against the background of information technology, such as artificial intelligence, big data, and deep learning. Although great progress has been achieved and surveyed in addressing nonlinear optimal control problems, the research on the robustness of ADP-based control strategies under uncertain environments has not been fully summarized. Hence, this survey reviews the recent main results of adaptive-critic-based robust control design of continuous-time nonlinear systems. The ADP-based nonlinear optimal regulation is reviewed, followed by robust stabilization of nonlinear systems with matched uncertainties, guaranteed cost control design of unmatched plants, and decentralized stabilization of interconnected systems. Additionally, further comprehensive discussions are presented, including event-based robust control design, improvement of the critic learning rule, nonlinear H∞ control design, and several notes on future perspectives. By applying the ADP-based optimal and robust control methods to a practical power system and an overhead crane plant, two typical examples are provided to verify the effectiveness of the theoretical results. Overall, this survey is beneficial to promote the development of adaptive critic control methods with robustness guarantee and the construction of higher-level intelligent systems.

  12. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  13. Statistics, data mining, and machine learning in astronomy a practical Python guide for the analysis of survey data

    CERN Document Server

    Ivezic, Željko; VanderPlas, Jacob T; Gray, Alexander

    2014-01-01

    As telescopes, detectors, and computers grow ever more powerful, the volume of data at the disposal of astronomers and astrophysicists will enter the petabyte domain, providing accurate measurements for billions of celestial objects. This book provides a comprehensive and accessible introduction to the cutting-edge statistical methods needed to efficiently analyze complex data sets from astronomical surveys such as the Panoramic Survey Telescope and Rapid Response System, the Dark Energy Survey, and the upcoming Large Synoptic Survey Telescope. It serves as a practical handbook for graduate s

  14. What Statistics Canada Survey Data Sources are Available to Study Neurodevelopmental Conditions and Disabilities in Children and Youth?

    Directory of Open Access Journals (Sweden)

    Rubab G. Arim

    2016-09-01

    Full Text Available Researchers with an interest in examining and better understanding the social context of children suffering from neurodevelopmental disabilities can benefit by using data from a wide variety of Statistics Canada surveys as well as the information contained in administrative health databases. Selective use of a particular survey and database can be informative particularly when demographics, samples, and content align with the goals and outcomes of the researcher’s questions of interest. Disabilities are not merely conditions in isolation. They are a key part of a social context involving impairment, function, and social facilitators or barriers, such as work, school and extracurricular activities. Socioeconomic factors, single parenthood, income, and education also play a role in how families cope with children’s disabilities. Statistics indicate that five per cent of Canadian children aged five to 14 years have a disability, and 74 per cent of these are identified as having a neurodevelopmental condition and disability. A number of factors must be taken into account when choosing a source of survey data, including definitions of neurodevelopmental conditions, the target group covered by the survey, which special populations are included or excluded, along with a comparison group, and the survey’s design. Surveys fall into categories such as general health, disability-specific, and children and youth. They provide an excellent opportunity to look at the socioeconomic factors associated with the health of individuals, as well as how these conditions and disabilities affect families. However rich the information gleaned from survey data, it is not enough, especially given the data gaps that exist around the health and well-being of children and older youths. This is where administrative and other data can be used to complement existing data sources. Administrative data offer specific information about neurological conditions that won’t be

  15. Application of Statistical Process Control (SPC in it´s Quality control

    Directory of Open Access Journals (Sweden)

    Carlos Hernández-Pedrera

    2015-12-01

    Full Text Available The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. As specific objectives, we set out to identify the variables to be analyzed for statistical process control (SPC), to analyze possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with the SPC study before and after direct monitoring on the production line. Sampling methods and laboratory replacement were used to determine the quality of the finished product; statistical methods were then applied, seeking to emphasize the importance and contribution of their application in monitoring corrective actions and supporting production processes. It was shown that the process is under control because the results were found within the established control limits. There is, however, a tendency for results to be displaced toward one end of the control limits; if the distribution exceeds the limits, the process may under certain conditions go out of control. The results also showed that the process, while within the quality control limits, is operating far from optimal conditions. In none of the study situations were products obtained outside the weight limits or with discoloration, but defective products were obtained.

  16. Net analyte signal based statistical quality control

    NARCIS (Netherlands)

    Skibsted, E.T.S.; Boelens, H.F.M.; Westerhuis, J.A.; Smilde, A.K.; Broad, N.W.; Rees, D.R.; Witte, D.T.

    2005-01-01

    Net analyte signal statistical quality control (NAS-SQC) is a new methodology to perform multivariate product quality monitoring based on the net analyte signal approach. The main advantage of NAS-SQC is that the systematic variation in the product due to the analyte (or property) of interest is

  17. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  18. A Survey of Statistical Capstone Projects

    Science.gov (United States)

    Martonosi, Susan E.; Williams, Talithia D.

    2016-01-01

    In this article, we highlight the advantages of incorporating a statistical capstone experience in the undergraduate curriculum, where students perform an in-depth analysis of real-world data. Capstone experiences develop statistical thinking by allowing students to engage in a consulting-like experience that requires skills outside the scope of…

  19. Statistical evaluation of Pacific Northwest Residential Energy Consumption Survey weather data

    Energy Technology Data Exchange (ETDEWEB)

    Tawil, J.J.

    1986-02-01

    This report addresses an issue relating to energy consumption and conservation in the residential sector. BPA has obtained two meteorological data bases for use with its 1983 Pacific Northwest Residential Energy Survey (PNWRES). One data base consists of temperature data from weather stations; these have been aggregated to form a second data base that covers the National Oceanographic and Atmospheric Administration (NOAA) climatic divisions. At BPA's request, Pacific Northwest Laboratory has produced a household energy use model for both electricity and natural gas in order to determine whether the statistically estimated parameters of the model significantly differ when the two different meteorological data bases are used.

  20. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  1. Application of statistical process control to qualitative molecular diagnostic assays.

    Directory of Open Access Journals (Sweden)

    Cathal P O'Brien

    2014-11-01

    Full Text Available Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers, with a resultant protracted time to detection. Modelled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
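
    The frequency-estimate-plus-confidence-interval idea described above can be sketched in a few lines of Python; the expected frequency and batch counts below are illustrative assumptions, not the study's values.

```python
from scipy.stats import binomtest

# Monitor an assay's observed mutation-positive rate against an expected
# frequency; flag the assay when the CI around the observed rate excludes
# the expected value. Numbers are illustrative.
expected = 0.40          # historical mutation frequency for this assay
positives, n = 25, 100   # positives observed in the current batch

ci = binomtest(positives, n, expected).proportion_ci(confidence_level=0.95)
in_control = ci.low <= expected <= ci.high
print(ci, in_control)    # here the CI excludes 0.40 -> investigate the assay
```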

  2. Assessment of the GPC Control Quality Using Non–Gaussian Statistical Measures

    Directory of Open Access Journals (Sweden)

    Domański Paweł D.

    2017-06-01

    Full Text Available This paper presents an alternative approach to the task of control performance assessment. Various statistical measures based on Gaussian and non-Gaussian distribution functions are evaluated. The analysis starts with the review of control error histograms followed by their statistical analysis using probability distribution functions. Simulation results obtained for a control system with the generalized predictive controller algorithm are considered. The proposed approach using Cauchy and Lévy α-stable distributions shows robustness against disturbances and enables effective control loop quality evaluation. Tests of the predictive algorithm prove its ability to detect the impact of the main controller parameters, such as the model gain, the dynamics or the prediction horizon.
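
    To see why non-Gaussian measures can matter for control-error histograms, one can compare Gaussian and Cauchy fits to a heavy-tailed error series. A short illustrative sketch with simulated errors, not the paper's GPC simulations:

```python
import numpy as np
from scipy import stats

# Heavy-tailed control errors (e.g. under disturbances) favour the
# Cauchy / alpha-stable family over a Gaussian description.
rng = np.random.default_rng(5)
error = stats.cauchy.rvs(loc=0.0, scale=0.5, size=2000, random_state=rng)

mu, sd = stats.norm.fit(error)        # Gaussian fit
loc, scale = stats.cauchy.fit(error)  # Cauchy fit

# Compare log-likelihoods: higher is better; Cauchy wins on this series
print(stats.norm.logpdf(error, mu, sd).sum(),
      stats.cauchy.logpdf(error, loc, scale).sum())
```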

  3. Statistical quality control a loss minimization approach

    CERN Document Server

    Trietsch, Dan

    1999-01-01

    While many books on quality espouse the Taguchi loss function, they do not examine its impact on statistical quality control (SQC). But using the Taguchi loss function sheds new light on questions relating to SQC and calls for some changes. This book covers SQC in a way that conforms with the need to minimize loss. Subjects often not covered elsewhere include: (i) measurements, (ii) determining how many points to sample to obtain reliable control charts (for which purpose a new graphic tool, diffidence charts, is introduced), (iii) the connection between process capability and tolerances, (iv)

  4. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  5. An introduction to statistical process control in research proteomics.

    Science.gov (United States)

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general purpose, and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3] making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier

  6. Geodetic Control Points - National Geodetic Survey Benchmarks

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — This data contains a set of geodetic control stations maintained by the National Geodetic Survey. Each geodetic control station in this dataset has either a precise...

  7. Statistical survey of the consumption of the products from seminatural environment in the Czech Republic

    International Nuclear Information System (INIS)

    Malatova, I.; Tecl, J.

    2001-01-01

    The central aspect of the work in the international project SAVEC (Spatial Analysis of Vulnerable Ecosystems in Central Europe) is the identification of vulnerable areas. These are areas which, by virtue of the processes governing the transfer of radionuclides through food chains, deliver high radionuclide fluxes to man. Identification of vulnerable areas is essential in establishing where intervention levels are likely to be exceeded in the event of a nuclear accident. Vulnerable areas can be identified using many criteria, among others high production rates of certain foodstuffs which substantially accumulate a specific radionuclide. In addition to environmental factors, social factors also contribute to vulnerability; in particular, dietary preferences can lead to the ingestion of more contaminated foodstuffs. Since the Chernobyl accident, it is a well-known fact that in the Czech Republic the most contaminated foodstuffs come from the semi-natural ecosystem (mushrooms, forest berries and game). However, the size and composition of the critical group had not been quantitatively evaluated until now. Therefore, in the frame of the SAVEC project, a statistical survey was performed with the aim of finding the critical group of inhabitants in the Czech Republic. The survey was performed by a specialised marketing agency, AMASIA. It was aimed at the consumption of products from the semi-natural environment: mushrooms, forest berries and game. Two independent sub-surveys were performed. The first one, aimed at households randomly selected from the phone directory, had the goal of obtaining 1500 interviews across the whole population; in individual regions, the number of respondents was selected according to the number of inhabitants of the region. The second survey was aimed at hunters and their households. Mostly members of hunting associations were included in the survey, as there is only a small number of professional hunters in the Czech Republic. The criterion was chosen that the respondent has to be a hunter

  8. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology that is presented applies only to surveys for fixed contamination. Surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination

  9. Assessing attitudes towards statistics among medical students: psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS.

    Directory of Open Access Journals (Sweden)

    Dejana Stanisavljevic

    Full Text Available BACKGROUND: Medical statistics has become important and relevant for future doctors, enabling them to practice evidence based medicine. Recent studies report that students' attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. METHODS: The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through the examination of factorial structure and internal consistency. RESULTS: Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051-0.078) was below the suggested value of ≤0.08. Cronbach's alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. CONCLUSION: Present study provided the evidence for the appropriate metric properties of the Serbian version of SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS might be reliable and a valid instrument for identifying medical students' attitudes towards statistics in the

  10. Assessing attitudes towards statistics among medical students: psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS).

    Science.gov (United States)

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

    Medical statistics has become important and relevant for future doctors, enabling them to practice evidence based medicine. Recent studies report that students' attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through the examination of factorial structure and internal consistency. Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051-0.078) was below the suggested value of ≤0.08. Cronbach's alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. Present study provided the evidence for the appropriate metric properties of the Serbian version of SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS might be reliable and a valid instrument for identifying medical students' attitudes towards statistics in the Serbian educational context.
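
    Cronbach's alpha, reported above as 0.90, is straightforward to compute from an item-score matrix. A minimal Python sketch with simulated Likert-type data (the SATS data themselves are not reproduced in the record):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Toy 7-point Likert responses: 50 respondents, 6 items driven by one factor
rng = np.random.default_rng(6)
latent = rng.normal(size=(50, 1))
scores = np.clip(np.round(4 + latent + rng.normal(0, 0.8, (50, 6))), 1, 7)
print(cronbach_alpha(scores))  # higher values indicate internal consistency
```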

  11. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    Science.gov (United States)

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
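
    The confidence intervals reported above are standard intervals for a binomial proportion; for example, 17 of 100 papers with an exact (Clopper-Pearson) interval reproduces the quoted 10-26%. A minimal sketch, assuming statsmodels is available:

        from statsmodels.stats.proportion import proportion_confint

        # 17 of 100 sampled papers had conclusions not justified by the results
        low, high = proportion_confint(count=17, nobs=100, alpha=0.05, method="beta")
        print(f"17% (95% CI {low:.0%}-{high:.0%})")  # exact (Clopper-Pearson) interval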

  12. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals

    Directory of Open Access Journals (Sweden)

    Parsons Nick R

    2012-04-01

    Full Text Available Abstract Background The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. Methods A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. Results The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10–26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30–49%) of studies a different analysis should have been undertaken and in 17% (10–26%) a different analysis could have made a difference to the overall conclusions. Conclusion It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.

  13. Health and human rights: a statistical measurement framework using household survey data in Uganda.

    Science.gov (United States)

    Wesonga, Ronald; Owino, Abraham; Ssekiboobo, Agnes; Atuhaire, Leonard; Jehopio, Peter

    2015-05-03

    Health is intertwined with human rights, as is clearly reflected in the right to life. Promotion of health practices in the context of human rights can be accomplished if there is a better understanding of the level of human rights observance. In this paper, we evaluate and present an appraisal of the possibility of applying household surveys to study the determinants of health and human rights and also to derive the probability that human rights are observed; an important ingredient in the national planning framework. Data from the Uganda National Governance Baseline Survey were used. A conceptual framework for predictors of a hybrid dependent variable was developed and both bivariate and multivariate statistical techniques employed. Multivariate post-estimation computations were derived after evaluations of the significance of coefficients of health and human rights predictors. Findings show that the household characteristics of respondents considered in this study were statistically significant (p < 0.05) determinants of health and human rights observance. For example, a unit increase in respondents' schooling levels results in an increase of about 34% in the level of positively assessing human rights observance. Additionally, the study establishes, through the three models presented, that household assessment of health and human rights observance was 20%, which also represents how much of the entire continuum of human rights is demanded. The findings provide important evidence for monitoring and evaluation of health in the context of human rights using household survey data. They provide a benchmark for health and human rights assessments with a focus on international and national development plans to achieve socio-economic transformation and health in society.

  14. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  15. Pengendalian Kualitas Kertas Dengan Menggunakan Statistical Process Control di Paper Machine 3 [Paper Quality Control Using Statistical Process Control in Paper Machine 3]

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2017-01-01

    Full Text Available The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method. Statistical process control (SPC) is a technique for solving problems and is used to monitor, control, analyze, manage and improve products and processes using statistical methods. Based on Pareto diagrams, the wavy defect is found to be the most frequent defect, accounting for 81.7% of defects. The human factor, meanwhile, is found to be the main cause of defects, primarily due to a lack of understanding of the machinery and a lack of training, both leading to errors in data input.
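
    A Pareto analysis of the kind used above reduces to sorting defect categories by count and accumulating percentages. A minimal sketch; all tallies except the 81.7% wavy share are invented:

        import numpy as np

        # Hypothetical defect tallies; only the wavy share (81.7%) is from the abstract
        defects = {"wavy": 817, "holes": 90, "dirt": 50, "wrinkles": 43}

        names = sorted(defects, key=defects.get, reverse=True)
        counts = np.array([defects[n] for n in names])
        cum_pct = 100 * counts.cumsum() / counts.sum()

        for name, n, c in zip(names, counts, cum_pct):
            print(f"{name:10s} {n:5d}  cumulative {c:5.1f}%")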

  16. A statistical evaluation of the design and precision of the shrimp trawl survey off West Greenland

    DEFF Research Database (Denmark)

    Folmer, Ole; Pennington, M.

    2000-01-01

    Statistical techniques were used to estimate two indices of shrimp abundance and their precision, and to determine the effective sample sizes for estimates of length-frequency distributions. It is concluded that the surveys produce a fairly precise abundance index and that, given the relatively small effective sample size, reducing tow duration to 15 min would increase overall survey precision. An unexpected outcome of the analysis is that the density of shrimp appears to have been fairly stable over the last 11 years. (C) 2000 Elsevier Science B.V. All rights reserved.

  17. Pitch Motion Stabilization by Propeller Speed Control Using Statistical Controller Design

    DEFF Research Database (Denmark)

    Nakatani, Toshihiko; Blanke, Mogens; Galeazzi, Roberto

    2006-01-01

    This paper describes a dynamics analysis of a small training boat and the possibility of ship pitch stabilization by control of propeller speed. After upgrading the navigational system of an actual small training boat, real data collected during sea trials were used for statistical analysis and system identification in order to identify a model of the ship. This analysis shows that the pitching motion is indeed influenced by engine speed, and it is suggested that there exists a possibility of reducing the pitching motion by properly controlling the engine throttle. Based on this observation…

  18. Development of nuclear power plant online monitoring system using statistical quality control

    International Nuclear Information System (INIS)

    An, Sang Ha

    2006-02-01

    Statistical Quality Control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control is also presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of the heat exchanger. This research uses Shewhart X-bar and R charts, cumulative sum charts (CUSUM), and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control, and a Control Chart Analyzer (CCA) was developed to support these analyses and to flag out-of-control conditions in the process. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
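
    The Shewhart X-bar/R limits mentioned above can be sketched as follows; the data are synthetic readings, not the plant's, and the snippet is not the authors' CCA software. The constants A2, D3, and D4 are the standard tabulated values for subgroups of five:

        import numpy as np

        A2, D3, D4 = 0.577, 0.0, 2.114  # control-chart constants for subgroup size 5

        rng = np.random.default_rng(1)
        subgroups = rng.normal(100.0, 2.0, size=(25, 5))  # 25 subgroups of 5 readings

        xbar = subgroups.mean(axis=1)                     # subgroup means
        R = subgroups.max(axis=1) - subgroups.min(axis=1) # subgroup ranges
        xbarbar, Rbar = xbar.mean(), R.mean()

        print(f"X-bar chart: CL={xbarbar:.2f}, "
              f"UCL={xbarbar + A2 * Rbar:.2f}, LCL={xbarbar - A2 * Rbar:.2f}")
        print(f"R chart: CL={Rbar:.2f}, UCL={D4 * Rbar:.2f}, LCL={D3 * Rbar:.2f}")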

  19. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Science.gov (United States)

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with the three-degrees-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively.
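
    Patient-specific monitoring of this kind can be sketched with a generic EWMA control chart, one common SPC choice (the study's exact chart construction may differ); the NCC values, smoothing constant, and limit width below are assumptions:

        import numpy as np

        def ewma(values, lam):
            """Exponentially weighted moving average of a metric series."""
            z = np.empty(len(values))
            z[0] = values[0]
            for i in range(1, len(values)):
                z[i] = lam * values[i] + (1 - lam) * z[i - 1]
            return z

        # Hypothetical per-fraction NCC values; the first five set the baseline
        ncc = np.array([0.92, 0.93, 0.91, 0.92, 0.94, 0.93, 0.90, 0.78, 0.91, 0.92])
        lam, L = 0.2, 3.0
        mu, sigma = ncc[:5].mean(), ncc[:5].std(ddof=1)
        z = ewma(ncc, lam)
        limit = L * sigma * np.sqrt(lam / (2.0 - lam))  # asymptotic EWMA limits
        print("flagged fractions:", np.where(np.abs(z - mu) > limit)[0])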

  20. Medical facility statistics in Japan.

    Science.gov (United States)

    Hamajima, Nobuyuki; Sugimoto, Takuya; Hasebe, Ryo; Myat Cho, Su; Khaing, Moe; Kariya, Tetsuyoshi; Mon Saw, Yu; Yamamoto, Eiko

    2017-11-01

    Medical facility statistics provide essential information to policymakers, administrators, academics, and practitioners in the field of health services. In Japan, the Health Statistics Office of the Director-General for Statistics and Information Policy at the Ministry of Health, Labour and Welfare generates these statistics. Although the statistics are widely available in both Japanese and English, the methodology described in the technical reports is primarily in Japanese and is not fully described in English. This article aimed to describe these processes for readers in the English-speaking world. The Health Statistics Office routinely conducts two surveys, called the Hospital Report and the Survey of Medical Institutions. The subjects of the former are all the hospitals and clinics with long-term care beds in Japan. It comprises a Patient Questionnaire focusing on the numbers of inpatients, admissions, discharges, and outpatients in one month, and an Employee Questionnaire, which asks about the number of employees as of October 1. The Survey of Medical Institutions consists of the Dynamic Survey, which focuses on the opening and closing of facilities every month, and the Static Survey, which focuses on staff, facilities, and services as of October 1, as well as the number of inpatients as of September 30 and the total number of outpatients during September. All hospitals, clinics, and dental clinics are requested to submit the Static Survey questionnaire every three years. These surveys are useful tools for collecting essential information, as well as providing occasions to implicitly inform facilities of the movements of government policy.

  1. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    Science.gov (United States)

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  2. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    Science.gov (United States)

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Human factors survey of advanced instrumentation and controls

    International Nuclear Information System (INIS)

    Carter, R.J.

    1989-01-01

    A survey oriented towards identifying the human factors issues in regard to the use of advanced instrumentation and controls (I&C) in the nuclear industry was conducted. A number of United States (US) and Canadian nuclear vendors and utilities were participants in the survey. Human factors items, subsumed under the categories of computer-generated displays (CGD), controls, organizational support, training, and related topics, were discussed. The survey found the industry to be concerned about the human factors issues related to the implementation of advanced I&C. Fifteen potential human factors problems were identified. They include: the need for an advanced I&C guideline equivalent to NUREG-0700; a role change in the control room from operator to supervisor; information overload; adequacy of existing training technology for advanced I&C; and operator acceptance and trust. 11 refs., 1 tab

  4. COMPARISON OF STATISTICALLY CONTROLLED MACHINING SOLUTIONS OF TITANIUM ALLOYS USING USM

    Directory of Open Access Journals (Sweden)

    R. Singh

    2010-06-01

    Full Text Available The purpose of the present investigation is to compare the statistically controlled machining solutions of titanium alloys using ultrasonic machining (USM). In this study, the previously developed Taguchi model for USM of titanium and its alloys has been investigated and compared. Relationships between the material removal rate, tool wear rate, surface roughness and other controllable machining parameters (power rating, tool type, slurry concentration, slurry type, slurry temperature and slurry size) have been deduced. The results of this study suggest that at the best settings of controllable machining parameters for titanium alloys (based upon the Taguchi design), the machining solution with USM is statistically controlled, which is not observed for other settings of input parameters on USM.

  5. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  6. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and to exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%, mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems, with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
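
    The p-charts used in this analysis have closed-form limits: the overall event proportion plus or minus three standard errors, with the standard error depending on each period's caseload. A minimal sketch with invented monthly counts, not the study's data:

        import numpy as np

        events = np.array([198, 210, 185, 240, 205, 190])            # adverse events
        anesthetics = np.array([1100, 1150, 1050, 1200, 1080, 1020]) # cases per month

        pbar = events.sum() / anesthetics.sum()                      # centre line
        for month, (x, n) in enumerate(zip(events, anesthetics), 1):
            se = np.sqrt(pbar * (1 - pbar) / n)
            ucl, lcl = pbar + 3 * se, max(pbar - 3 * se, 0.0)
            p = x / n
            status = "in control" if lcl <= p <= ucl else "out of control"
            print(f"month {month}: p={p:.3f} limits=({lcl:.3f}, {ucl:.3f}) {status}")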

  7. Statistical physics of human beings in games: Controlled experiments

    Science.gov (United States)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  8. Survey of statistical and sampling needs for environmental monitoring of commercial low-level radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Thomas, J.M.

    1986-07-01

    This project was designed to develop guidance for implementing 10 CFR Part 61 and to determine the overall needs for sampling and statistical work in characterizing, surveying, monitoring, and closing commercial low-level waste sites. When cost-effectiveness and statistical reliability are of prime importance, then double sampling, compositing, and stratification (with optimal allocation) are identified as key issues. If the principal concern is avoiding questionable statistical practice, then the applicability of kriging (for assessing spatial pattern), methods for routine monitoring, and use of standard textbook formulae in reporting monitoring results should be reevaluated. Other important issues identified include sampling for estimating model parameters and the use of data from left-censored (less than detectable limits) distributions

  9. Sampling methods to the statistical control of the production of blood components.

    Science.gov (United States)

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

    The control of blood component specifications is a requirement established in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of this control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, sampling is intended solely to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, it can be argued that this model does not amount to a consistent sampling technique. This could be a severe limitation on detecting abnormal patterns and on assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of related statistical process control decisions for the purposes they are suggested for. Copyright © 2017 Elsevier Ltd. All rights reserved.
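
    To make the sampling question concrete: under simple random sampling from a finite lot, the probability of detecting any nonconforming component follows the hypergeometric distribution. A sketch with assumed lot and sample sizes, not figures from the article:

        from scipy.stats import hypergeom

        N, D, n = 2000, 20, 60  # lot size, nonconforming units (1%), sample size
        p_miss = hypergeom.pmf(0, N, D, n)  # P(sample contains no nonconforming unit)
        print(f"P(detect at least one nonconforming unit) = {1 - p_miss:.2f}")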

  10. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Surveys & Programs

    Science.gov (United States)


  12. A survey on control schemes for distributed solar collector fields. Part II: Advanced control approaches

    Energy Technology Data Exchange (ETDEWEB)

    Camacho, E.F.; Rubio, F.R. [Universidad de Sevilla, Escuela Superior de Ingenieros, Departamento de Ingenieria de Sistemas y Automatica, Camino de Los Descubrimientos s/n, E-41092 Sevilla (Spain); Berenguel, M. [Universidad de Almeria, Departamento de Lenguajes y Computacion, Area de Ingenieria de Sistemas y Automatica, Carretera Sacramento s/n, E-04120 La Canada, Almeria (Spain); Valenzuela, L. [Plataforma Solar de Almeria - CIEMAT, Carretera Senes s/n, P.O. Box 22, E-04200 Tabernas (Almeria) (Spain)

    2007-10-15

    This article presents a survey of the different advanced automatic control techniques that have been applied to control the outlet temperature of solar plants with distributed collectors during the last 25 years. A classification of the modeling and control approaches described in the first part of this survey is used to explain the main features of each strategy. The treated strategies range from classical advanced control strategies to those with few industrial applications. (author)

  13. Predicting survey responses: how and why semantics shape survey statistics on organizational behaviour.

    Directory of Open Access Journals (Sweden)

    Jan Ketil Arnulf

    Full Text Available Some disciplines in the social sciences rely heavily on collecting survey responses to detect empirical relationships among variables. We explored whether these relationships were a priori predictable from the semantic properties of the survey items, using language processing algorithms which are now available as new research methods. Language processing algorithms were used to calculate the semantic similarity among all items in state-of-the-art surveys from Organisational Behaviour research. These surveys covered areas such as transformational leadership, work motivation and work outcomes. This information was used to explain and predict the response patterns from real subjects. Semantic algorithms explained 60-86% of the variance in the response patterns and allowed remarkably precise prediction of survey responses from humans, except in a personality test. Even the relationships between independent and their purported dependent variables were accurately predicted. This raises concern about the empirical nature of data collected through some surveys if results are already given a priori through the way subjects are being asked. Survey response patterns seem heavily determined by semantics. Language algorithms may suggest these prior to administering a survey. This study suggests that semantic algorithms are becoming new tools for the social sciences, opening perspectives on survey responses that prevalent psychometric theory cannot explain.
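
    At its core, the semantic similarity between two items is a cosine between their vector representations. A toy sketch; the three-dimensional vectors stand in for embeddings produced by a language-processing algorithm and are invented:

        import numpy as np

        def cosine(u, v):
            """Cosine similarity between two item vectors."""
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

        item_a = np.array([0.8, 0.1, 0.3])  # e.g., "My leader inspires me"
        item_b = np.array([0.7, 0.2, 0.4])  # e.g., "My leader motivates me"
        print(f"semantic similarity = {cosine(item_a, item_b):.2f}")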

  14. From Quality to Information Quality in Official Statistics

    Directory of Open Access Journals (Sweden)

    Kenett Ron S.

    2016-12-01

    Full Text Available The term quality of statistical data, developed and used in official statistics and international organizations such as the International Monetary Fund (IMF) and the Organisation for Economic Co-operation and Development (OECD), refers to the usefulness of summary statistics generated by producers of official statistics. Similarly, in the context of survey quality, official agencies such as Eurostat, the National Center for Science and Engineering Statistics (NCSES), and Statistics Canada have created dimensions for evaluating the quality of a survey and its ability to report ‘accurate survey data’.

  15. Statistical Process Control. Impact and Opportunities for Ohio.

    Science.gov (United States)

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  16. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    a large number of nonconforming products are manufactured. ... size, n, sampling interval, h, and control limit parameter, k, that minimize the ...

  17. A survey of the artisanal fisheries of Kontagora Reservoir, Niger State

    African Journals Online (AJOL)

    DR. AMINU

    December, 2007, using statistical frame survey and catch assessment survey. ... monitoring, control and surveillance (MCS) system was suggested for management ...

  18. Business Statistics Education: Content and Software in Undergraduate Business Statistics Courses.

    Science.gov (United States)

    Tabatabai, Manouchehr; Gamble, Ralph

    1997-01-01

    Survey responses from 204 of 500 business schools identified the topics covered most often in business statistics I and II courses. The most popular software at both levels was Minitab. Most schools required both statistics I and II. (SK)

  19. Belief Control Practices and Organizational Performances: A Survey ...

    African Journals Online (AJOL)

    Belief Control Practices and Organizational Performances: A Survey of Sugar Industry in Kenya. ... employees in the company core values and design of strategic control systems to cope with changing internal and external operating business ...

  20. Statistical Disclosure Control for Micro-Data Using the R Package sdcMicro

    Directory of Open Access Journals (Sweden)

    Matthias Templ

    2015-10-01

    The R package sdcMicro serves as an easy-to-handle, object-oriented S4 class implementation of SDC methods to evaluate and anonymize confidential micro-data sets. It includes all popular disclosure risk and perturbation methods. The package performs automated recalculation of frequency counts, individual and global risk measures, information loss and data utility statistics after each anonymization step. All methods are highly optimized in terms of computational costs to be able to work with large data sets. Reporting facilities that summarize the anonymization process can also be easily used by practitioners. We describe the package and demonstrate its functionality with a complex household survey test data set that has been distributed by the International Household Survey Network.
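
    The frequency-count risk measure that sdcMicro recalculates after each anonymization step can be illustrated conceptually; the Python sketch below mimics the idea only and is not the package's R API (the data and key variables are made up):

        import pandas as pd

        df = pd.DataFrame({
            "region":    ["N", "N", "S", "S", "S", "E"],
            "age_group": ["30-39", "30-39", "40-49", "40-49", "30-39", "50-59"],
        })
        key_vars = ["region", "age_group"]

        # Sample frequency of each key combination; unique keys are re-identifiable
        fk = df.groupby(key_vars)[key_vars[0]].transform("count")
        df["at_risk"] = fk == 1
        print(df)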

  1. Reducing lumber thickness variation using real-time statistical process control

    Science.gov (United States)

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  2. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  3. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of the ISO 9000 series norms) of various processes, products and services belong amongst the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the principles of: the diagnostics of causes and effects, Pareto analysis and the Lorenz curve, number distributions and frequency curves of random variable distributions, and Shewhart control charts, are presented in the contribution.

  4. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Science.gov (United States)

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.

  5. An Experience of Statistical Method Application in Forest Survey at Angara River Region in 1932

    Directory of Open Access Journals (Sweden)

    L. N. Vashchuk

    2014-10-01

    Full Text Available A report of the Angara forest economic expedition's 1932 survey of the left bank of the Angara River has been found. The survey covered a part of Krasnoyarsk Territory and Irkutsk Region, a total area of 18641.8 thousand ha. The report describes the forest inventory technology and achievements that have not previously been published. The survey was conducted by a statistical method, which consisted of a sample based on continuous forest inventory enumeration of trees on sample plots (SP) arranged in an array according to a particular system, followed by mathematical-statistical recalculation of the sample results to the entire survey area. To do this, strip finders (sights) were cut in the latitudinal direction at a distance of 16 km from one another. On the cut sights, every 2 km, 0.1 ha (10 × 100 m) SP were established. In total 32 forest inventory sights were cut, with a total length of 9931 km, which incorporated 4817 SP. The accuracy of determining forest inventory characteristics was also investigated using smaller sample plots. For this purpose, each SP was reduced to a smaller area of 0.01 ha (10 × 10 m), where an independent continuous enumeration of trees was conducted, and sample trees were cut, measured and bucked into assortments to explore the tree stand assortment structure. At each «sample cutting area» all trees with DBH of 44 cm and above were felled and measured. On half of each sample plot, 5 × 10 m in size and located at the eastern end, all trees with DBH of 24 cm and above were felled and measured. In every fifth «sample cutting area», all trees with DBH of 12 cm and above were cut down and measured. Based on the results of this work, a detailed description of forest resources in the whole Angara river basin, and across 17 forest exploitation areas, was completed.

  6. Statistical transformation and the interpretation of inpatient glucose control data.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
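
    The Box-Cox step described above is a one-line call in common statistics libraries. A minimal sketch on synthetic, right-skewed glucose-like values (the study's actual POC-BG data are not reproduced here):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        glucose = rng.lognormal(mean=np.log(140), sigma=0.3, size=500)  # skewed, mg/dL

        transformed, lam = stats.boxcox(glucose)  # lambda estimated by maximum likelihood
        print(f"lambda = {lam:.2f}")
        print(f"skewness before = {stats.skew(glucose):.2f}, "
              f"after = {stats.skew(transformed):.2f}")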

  7. Surveying Future Surveys

    Science.gov (United States)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  8. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of automatic processors and promote QC, based on analyzing the QC management chart for an automatic processor by statistical methods and evaluating and interpreting the data and trends of the chart. Method: The speed, contrast, and minimum density of the step wedge of a film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated. The data and the working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, the authors can judge whether it is a symmetric bell-shaped curve or not. If not, it indicates that a few extreme values overstepping the control limits are possibly pulling the curve to the left or right. If it is a normal distribution, the standard deviation (s) is observed. When x-bar ± 2s lies within the upper and lower control limits of the relative performance indexes, it indicates that the processor worked in a stable state during that period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. The authors can deepen their understanding and application of the trend chart, and raise quality management to a new level.

  9. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included

  10. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  11. National Center for Health Statistics

    Science.gov (United States)


  12. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)

  13. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  14. Schools and Staffing Survey (SASS): 1995. Selected Papers Presented at the Meeting of the American Statistical Association (Orlando, Florida, August 13-17, 1996). Working Paper Series.

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    The papers were presented at the Social Statistics Section, the Government Statistics Section, and the Section on Survey Research Methods. The following papers are included in the Social Statistics Section and Government Statistics Section, "Overcoming the Bureaucratic Paradigm: Memorial Session in Honor of Roger Herriot": "1995…

  15. A new instrument for statistical process control of thermoset molding

    International Nuclear Information System (INIS)

    Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.

    1991-01-01

    The recent development of a rugged, ceramic, mold-mounted dielectric sensor and high speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work, special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including the flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points as well as temperature and pressure were then recorded during normal production and plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper examples of both RIM and SMC are discussed.

  16. Our Surveys & Programs

    Science.gov (United States)


  17. Response Burden in Official Business Surveys: Measurement and Reduction Practices of National Statistical Institutes

    Directory of Open Access Journals (Sweden)

    Bavdaž Mojca

    2015-12-01

    Full Text Available Response burden in business surveys has long been a concern for National Statistical Institutes (NSIs) for three types of reasons: political reasons, because response burden is part of the total administrative burden governments impose on businesses; methodological reasons, because an excessive response burden may reduce data quality and increase data-collection costs; and strategic reasons, because it affects relations between the NSIs and the business community. This article investigates NSI practices concerning business response burden measurement and reduction actions based on a survey of 41 NSIs from 39 countries. Most NSIs monitor at least some burden aspects and have implemented some actions to reduce burden, but large differences exist between NSIs' methodologies for burden measurement and actions taken to reduce burden. Future research should find ways to deal with methodological differences in burden conceptualization, operationalization, and measurement, and provide insights into the effectiveness and efficiency of burden-reduction actions.

  18. Application of classical versus bayesian statistical control charts to on-line radiological monitoring

    International Nuclear Information System (INIS)

    DeVol, T.A.; Gohres, A.A.; Williams, C.L.

    2009-01-01

    False positive and false negative incidence rates of radiological monitoring data from classical and Bayesian statistical process control chart techniques are compared. On-line monitoring for illicit radioactive material with no false positives or false negatives is the goal of homeland security monitoring, but it is unrealistic. Statistical fluctuations in the detector signal, short detection times, large source-to-detector distances, and shielding effects make distinguishing between a radiation source and natural background particularly difficult. Experimental time series data were collected using a 1″ × 1″ LaCl3(Ce)-based scintillation detector (Scionix, Orlando, FL) under various simulated conditions. Experimental parameters include radionuclide (gamma-ray) energy, activity, density thickness (source-to-detector distance and shielding), time, and temperature. All statistical algorithms were developed using MATLAB™. The Shewhart (3-σ) control chart and the cumulative sum (CUSUM) control chart are the classical procedures adopted, while the Bayesian technique is the Shiryayev-Roberts (S-R) control chart. The Shiryayev-Roberts method was the best method for controlling the number of false positive detections, followed by the CUSUM method. However, the Shiryayev-Roberts method, used without modification, resulted in one of the highest false negative incidence rates independent of the signal strength. Modification of the Shiryayev-Roberts statistical analysis method reduced the number of false negatives, but resulted in an increase in the false positive incidence rate. (author)
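
    The CUSUM chart compared here accumulates deviations above a target count rate and alarms once the sum crosses a decision threshold. A sketch on simulated Poisson counts; the background rate, source strength, and chart parameters (k, h) are assumptions:

        import numpy as np

        def cusum_alarm(counts, target, k, h):
            """One-sided upper CUSUM; returns index of first alarm, or -1."""
            s = 0.0
            for i, x in enumerate(counts):
                s = max(0.0, s + (x - target - k))  # accumulate upward deviations
                if s > h:
                    return i
            return -1

        rng = np.random.default_rng(3)
        background = rng.poisson(lam=100, size=60)   # counts per interval
        with_source = rng.poisson(lam=110, size=40)  # weak source appears at t = 60
        counts = np.concatenate([background, with_source])

        print("first alarm at sample", cusum_alarm(counts, target=100.0, k=5.0, h=30.0))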

  19. Survey of orbital dynamics and control of space rendezvous

    Directory of Open Access Journals (Sweden)

    Luo Yazhong

    2014-02-01

    Full Text Available Rendezvous orbital dynamics and control (RODC) is a key technology for operating space rendezvous and docking missions. This paper surveys the studies on RODC. Firstly, the basic relative dynamics equation set is introduced and its improved versions are evaluated. Secondly, studies on rendezvous trajectory optimization are commented on from three aspects: linear rendezvous, nonlinear two-body rendezvous, and perturbed and constrained rendezvous. Thirdly, studies on relative navigation are briefly reviewed, and then close-range control methods including automated control, manual control, and telecontrol are analyzed. Fourthly, advances in rendezvous trajectory safety and robust analysis are surveyed, and their applications in trajectory optimization are discussed. Finally, conclusions are drawn and prospects of studies on RODC are presented.

  20. Statistical study on the self-selection bias in FDG-PET cancer screening by a questionnaire survey

    International Nuclear Information System (INIS)

    Kita, Tamotsu; Yano, Fuzuki; Watanabe, Sadahiro; Soga, Shigeyoshi; Hama, Yukihiro; Shinmoto, Hiroshi; Kosuda, Shigeru

    2008-01-01

    A questionnaire survey was performed to investigate the possible presence of self-selection bias in 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET) cancer screening (PET cancer screening). Responders to the questionnaire survey consisted of 80 healthy persons, who answered questions on whether they would undergo PET cancer screening, health consciousness, age, sex, and smoking history. Univariate and multivariate analyses of the four parameters were performed between the responders who were to undergo PET cancer screening and the responders who were not. A statistically significant difference was found in health consciousness between the above-mentioned two groups by both univariate and multivariate analysis, with an odds ratio of 2.088. The study indicated that self-selection bias is likely to exist in PET cancer screening. (author)
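
    An odds ratio such as the 2.088 above is computed from a 2 × 2 table; the counts below are hypothetical, chosen only to land near the reported value (the study's actual table is not given):

        import numpy as np

        #                  will screen   won't screen
        table = np.array([[25,           12],   # high health consciousness
                          [16,           16]])  # low health consciousness
        a, b = table[0]
        c, d = table[1]

        odds_ratio = (a * d) / (b * c)
        se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf's method
        ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
        print(f"OR = {odds_ratio:.3f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")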

  1. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered there and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows diagnostic plots to be built from the parameter estimates that can provide valuable insight…

  2. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
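
    The moving range chart recommended here estimates process spread from consecutive differences. A minimal sketch of individuals/moving-range (I-MR) limits on synthetic daily figures for a metric such as door-to-provider time (values assumed):

        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.normal(42.0, 4.0, size=30)  # daily metric, e.g., minutes

        mr = np.abs(np.diff(x))             # moving ranges of consecutive points
        mrbar = mr.mean()
        sigma_hat = mrbar / 1.128           # d2 constant for subgroups of size 2

        cl = x.mean()
        print(f"I chart: CL={cl:.1f}, UCL={cl + 3 * sigma_hat:.1f}, "
              f"LCL={cl - 3 * sigma_hat:.1f}")
        print(f"MR chart: CL={mrbar:.2f}, UCL={3.267 * mrbar:.2f}")  # D4 for n = 2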

  3. Use of statistical process control as part of a quality assurance plan

    International Nuclear Information System (INIS)

    Acosta, S.; Lewis, C.

    2013-01-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of results through the control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory of environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work is the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to take into account significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of a statistical process control. The data obtained yielded the lower and upper limits for the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that the chart can be used as part of a quality assurance plan

  4. Statistics and Corporate Environmental Management: Relations and Problems

    DEFF Research Database (Denmark)

    Madsen, Henning; Ulhøi, John Parm

    1997-01-01

    Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very...... in the external environment. The nature and extent of the practical use of quantitative techniques in corporate environmental management systems is discussed on the basis of a number of company surveys in four European countries.......Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very...... specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions...

  5. Guideline implementation in clinical practice: Use of statistical process control charts as visual feedback devices

    Directory of Open Access Journals (Sweden)

    Fahad A Al-Hussein

    2009-01-01

    Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.

  6. Statistical survey of day-side magnetospheric current flow using Cluster observations: magnetopause

    Directory of Open Access Journals (Sweden)

    E. Liebert

    2017-05-01

    Full Text Available We present a statistical survey of current structures observed by the Cluster spacecraft at high-latitude day-side magnetopause encounters in the close vicinity of the polar cusps. Making use of the curlometer technique and the fluxgate magnetometer data, we calculate the 3-D current densities and investigate the magnetopause current direction, location, and magnitude during varying solar wind conditions. We find that the orientation of the day-side current structures is in accordance with existing magnetopause current models. Based on the ambient plasma properties, we distinguish five different transition regions at the magnetopause surface and observe distinctive current properties for each region. Additionally, we find that the location of currents varies with respect to the onset of the changes in the plasma environment during magnetopause crossings.
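
    The curlometer estimate itself is not spelled out in the record; for context, a standard four-spacecraft formulation (following Dunlop et al.) applies Ampère's law, neglecting the displacement current, to the differences in position and magnetic field between spacecraft. Written for pair differences relative to spacecraft 1:

        % with \Delta\mathbf{r}_{1k} = \mathbf{r}_k - \mathbf{r}_1 and
        % \Delta\mathbf{B}_{1k} = \mathbf{B}_k - \mathbf{B}_1,
        \mu_0 \, \mathbf{J} \cdot (\Delta\mathbf{r}_{12} \times \Delta\mathbf{r}_{13})
          = \Delta\mathbf{B}_{12} \cdot \Delta\mathbf{r}_{13}
          - \Delta\mathbf{B}_{13} \cdot \Delta\mathbf{r}_{12}

    Cycling through the independent faces of the spacecraft tetrahedron yields three such projections and hence all three components of the current density J.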

  7. Statistical applications for chemistry, manufacturing and controls (CMC) in the pharmaceutical industry

    CERN Document Server

    Burdick, Richard K; Pfahler, Lori B; Quiroz, Jorge; Sidor, Leslie; Vukovinsky, Kimberly; Zhang, Lanju

    2017-01-01

    This book examines statistical techniques that are critically important to Chemistry, Manufacturing, and Control (CMC) activities. Statistical methods are presented with a focus on applications unique to the CMC in the pharmaceutical industry. The target audience consists of statisticians and other scientists who are responsible for performing statistical analyses within a CMC environment. Basic statistical concepts are addressed in Chapter 2 followed by applications to specific topics related to development and manufacturing. The mathematical level assumes an elementary understanding of statistical methods. The ability to use Excel or statistical packages such as Minitab, JMP, SAS, or R will provide more value to the reader. The motivation for this book came from an American Association of Pharmaceutical Scientists (AAPS) short course on statistical methods applied to CMC applications presented by four of the authors. One of the course participants asked us for a good reference book, and the only book recomm...

  8. The Sedentary Multi-Frequency Survey. I. Statistical Identification and Cosmological Properties of HBL BL Lacs

    OpenAIRE

    Giommi, P.; Menna, M. T.; Padovani, P.

    1999-01-01

    We have assembled a multi-frequency database by cross-correlating the NVSS catalog of radio sources with the RASSBSC list of soft X-ray sources, obtaining optical magnitude estimates from the Palomar and UK Schmidt surveys as provided by the APM and COSMOS on-line services. By exploiting the nearly unique broad-band properties of High-Energy Peaked (HBL) BL Lacs we have statistically identified a sample of 218 objects that is expected to include about 85% of BL Lacs and that is therefore seve...

  9. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Science.gov (United States)

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  10. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  11. A micro-controller based wide range survey meter

    International Nuclear Information System (INIS)

    Bhingare, R.R.; Bajaj, K.C.; Kannan, S.

    2004-01-01

    Wide-range survey meters (1 μSv/h to 10 Sv/h) with the detector(s) mounted at the end of a two-to-four-meter-long extendable tube are widely used for radiation protection surveys of difficult-to-reach locations and high-dose-rate areas. The commercially available survey meters of this type use two GM counters to cover a wide range of dose rate measurement. A new micro-controller based wide-range survey meter using two Si diode detectors has been developed. The use of solid-state detectors in the survey meter has a number of advantages, such as low power consumption, a lighter battery-powered detector probe, and the elimination of the high voltage needed to operate the detectors. For high reliability, the design uses infrared communication between the probe and the readout unit through a light-weight collapsible extension tube. The design details and features are discussed in detail. (author)

  12. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) of electron beam monitoring in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
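
    The process capability ratios quoted above can be reproduced in a few lines. A minimal sketch, with simulated daily output readings in place of the authors' data; the ±2% specification limits mirror the level mentioned in the abstract.

        import numpy as np

        def process_capability(x, lsl, usl):
            """Two-sided process capability indices from a sample."""
            x = np.asarray(x, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)
            cp = (usl - lsl) / (6 * sigma)              # potential capability
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # accounts for centring
            return cp, cpk

        # daily output readings, normalised so the target is 100% with +/-2% limits
        rng = np.random.default_rng(0)
        dose = rng.normal(100.0, 0.2, 120)          # simulated daily checks
        cp, cpk = process_capability(dose, lsl=98.0, usl=102.0)
        print(f"Cp = {cp:.1f}, Cpk = {cpk:.1f}")    # sigma = 0.2 -> Cp approx. 3.3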

  13. A Survey of Statistical Machine Translation

    Science.gov (United States)

    2007-04-01

    methods are notoriously sensitive to domain differences, however, so the move to informal text is likely to present many interesting challenges ... Och, Christoph Tillman, and Hermann Ney. Improved alignment models for statistical machine translation. In Proc. of EMNLP-VLC, pages 20-28, Jun 1999

  14. The New Migration Statistics: A Good Choice made by the INE (Spanish Institute for National Statistics [ENG

    Directory of Open Access Journals (Sweden)

    Carmen Ródenas

    2013-01-01

    Full Text Available The Spanish Institute for National Statistics (INE has decided to create new Migration Statistics (Estadística de Migraciones based upon Residential Variation Statistics (Estadística de Variaciones Residenciales. This article presents arguments to support this decision, in view of the continued lack of consistency found among the sources of the Spanish statistics system for measuring population mobility. Specifically, an insight is provided into the problems of underestimation and internal inconsistency in the Spanish Labour Force Survey when measuring immigration rates, based upon discrepancies identified in the three international immigration flow series produced by this survey.

  15. National Geodetic Survey (NGS) Geodetic Control Stations, (Horizontal and/or Vertical Control), March 2009

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This data contains a set of geodetic control stations maintained by the National Geodetic Survey. Each geodetic control station in this dataset has either a precise...

  16. Microgrid Controller and Advanced Distribution Management System Survey Report

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Guodong [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Starke, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Herron, Andrew N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-07-01

    A microgrid controller, which serves as the heart of a microgrid, is responsible for optimally managing the distributed energy resources, energy storage systems, and responsive demand and for ensuring the microgrid is operated in an efficient, reliable, and resilient way. As the market for microgrids has blossomed in recent years, many vendors have released their own microgrid controllers to meet the various needs of different microgrid clients. However, due to the absence of a recognized standard for such controllers, vendor-supported microgrid controllers have a range of functionalities that differ significantly from each other in many respects. As a result, the current state of the industry has been difficult to assess. To remedy this situation, the authors conducted a survey of the functions of microgrid controllers developed by vendors and national laboratories. This report presents a clear indication of the state of the microgrid-controller industry based on analysis of the survey results. The results demonstrate that US Department of Energy funded research in microgrid controllers is unique and not competing with that of industry.

  17. STATISTICAL ANALYSIS AND OPINION SURVEY UPON DICTATORSHIP AS A PEDAGOGICAL STRATEGY OF THE TEACHING OF HISTORY.

    Directory of Open Access Journals (Sweden)

    Vitória A. da Fonseca

    2016-07-01

    Full Text Available This paper presents a teaching practice whose purpose was to enable high-school students to understand dictatorship as a theme in the teaching of history. Considering the importance of practice as a tool in building a learning path, the activity involved debate, a survey, and statistical analysis. The engagement of the students in this activity and the mapping of their opinions about the dictatorship are worth highlighting.

  18. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical...
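
    As a pointer to how a "sigma process capability" figure can be derived, the sketch below converts an observed out-of-specification rate into a sigma level; the counts are invented, and the 1.5-sigma shift is the usual six-sigma convention rather than anything prescribed by this study.

        from scipy.stats import norm

        def sigma_level(defects, opportunities, shift=1.5):
            """Convert an observed defect rate into a long-term sigma level.
            The 1.5-sigma shift is the customary six-sigma convention."""
            dpmo = 1e6 * defects / opportunities
            return norm.ppf(1 - dpmo / 1e6) + shift, dpmo

        level, dpmo = sigma_level(defects=7, opportunities=50_000)  # illustrative counts
        print(f"DPMO = {dpmo:.0f}, sigma level = {level:.2f}")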

  19. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  20. Integrated alarm annunciation and entry control systems -- Survey results

    International Nuclear Information System (INIS)

    Clever, J.J.; Arakaki, L.H.; Monaco, F.M.; Juarros, L.E.; Quintana, G.R.

    1993-10-01

    This report provides the results and analyses of a detailed survey undertaken in Summer 1993 to address integrated intrusion detection alarm annunciation and entry control system issues. The survey was a first attempt to answer questions about integrated systems and commercial capabilities to meet or partially meet US Department of Energy (DOE) site needs

  1. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) determining natural groups or clusters of control strategies with a similar behaviour, ii) finding and interpreting hidden, complex and causal relation features in the data set and iii) identifying important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
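
    A minimal sketch of the clustering-plus-PCA part of the pipeline described above, using scikit-learn on a made-up evaluation matrix (the real study used BSM2 simulation outputs); discriminant analysis is omitted for brevity.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # Hypothetical evaluation matrix: rows = control strategies,
        # columns = performance criteria (effluent quality, cost, violations, ...)
        rng = np.random.default_rng(42)
        X = rng.normal(size=(30, 6))

        Z = StandardScaler().fit_transform(X)          # criteria on a common scale
        scores = PCA(n_components=2).fit_transform(Z)  # compress correlated criteria
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

        for k in range(3):
            print(f"cluster {k}: strategies {np.where(labels == k)[0].tolist()}")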

  2. A small unconditional non-financial incentive suggests an increase in survey response rates amongst older general practitioners (GPs): a randomised controlled trial study.

    Science.gov (United States)

    Pit, Sabrina Winona; Hansen, Vibeke; Ewald, Dan

    2013-07-30

    Few studies have investigated the effect of small unconditional non-monetary incentives on survey response rates amongst GPs or medical practitioners. This study assessed the effectiveness of offering a small unconditional non-financial incentive to increase survey response rates amongst general practitioners within a randomised controlled trial (RCT). The RCT was conducted within a general practice survey that investigated how to prolong working lives amongst ageing GPs in Australia. GPs (n = 125) were randomised to receive an attractive pen or no pen with their first invitation to participate in the survey. GPs could elect to complete the survey online or via mail. Two follow-up reminders were sent without a pen to both groups. The main outcome measure was the response rate. The response rate was higher in the intervention group, which received a pen (61.9%), than in the control group (46.8%). However, this study did not find a statistically significant effect of a small unconditional non-financial incentive (in the form of a pen) on survey response rates amongst GPs (odds ratio 1.85; 95% confidence interval 0.91 to 3.77). No GPs completed the online version. A small unconditional non-financial incentive, in the form of a pen, may improve response rates for GPs.
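
    The reported odds ratio and confidence interval can be reconstructed approximately from the stated percentages. A sketch, assuming 2x2 counts of 39/63 and 29/62, which are consistent with the reported 61.9% and 46.8% response rates:

        import math

        # Approximate 2x2 counts consistent with the reported rates
        # (39/63 = 61.9% with pen, 29/62 = 46.8% without; illustrative reconstruction)
        a, b = 39, 63 - 39   # intervention: responded / did not
        c, d = 29, 62 - 29   # control:      responded / did not

        or_ = (a / b) / (c / d)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of the log odds ratio (Wald)
        lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
        print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")  # approx. 1.85 (0.91 to 3.77)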

  3. American Housing Survey (AHS)

    Science.gov (United States)

  4. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build it. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.

  5. Statistical searches for microlensing events in large, non-uniformly sampled time-domain surveys: A test using palomar transient factory data

    Energy Technology Data Exchange (ETDEWEB)

    Price-Whelan, Adrian M.; Agüeros, Marcel A. [Department of Astronomy, Columbia University, 550 W 120th Street, New York, NY 10027 (United States); Fournier, Amanda P. [Department of Physics, Broida Hall, University of California, Santa Barbara, CA 93106 (United States); Street, Rachel [Las Cumbres Observatory Global Telescope Network, Inc., 6740 Cortona Drive, Suite 102, Santa Barbara, CA 93117 (United States); Ofek, Eran O. [Benoziyo Center for Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel); Covey, Kevin R. [Lowell Observatory, 1400 West Mars Hill Road, Flagstaff, AZ 86001 (United States); Levitan, David; Sesar, Branimir [Division of Physics, Mathematics, and Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States); Laher, Russ R.; Surace, Jason, E-mail: adrn@astro.columbia.edu [Spitzer Science Center, California Institute of Technology, Mail Stop 314-6, Pasadena, CA 91125 (United States)

    2014-01-20

    Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ∼20,000 deg² footprint. While the median 7.26 deg² PTF field has been imaged ∼40 times in the R band, ∼2300 deg² have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10⁹ light curves, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
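
    The von Neumann ratio used for the selection is the mean square successive difference divided by the sample variance: it is near 2 for uncorrelated noise and drops for smooth, correlated excursions such as a microlensing bump. A minimal sketch on simulated light curves (magnitudes and event shape are illustrative):

        import numpy as np

        def von_neumann_ratio(mag):
            """Mean square successive difference over variance.
            ~2 for uncorrelated noise; smaller for smooth, correlated variability."""
            mag = np.asarray(mag, dtype=float)
            msd = np.mean(np.diff(mag) ** 2)
            return msd / np.var(mag, ddof=1)

        rng = np.random.default_rng(3)
        noise = rng.normal(0, 0.05, 300)                 # flat light curve
        t = np.linspace(-3, 3, 300)
        bump = 0.3 * np.exp(-0.5 * t**2)                 # smooth brightening event
        print(von_neumann_ratio(noise))                  # close to 2
        print(von_neumann_ratio(noise + bump))           # noticeably below 2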

  6. Evaluation of statistical control charts for on-line radiation monitoring

    International Nuclear Information System (INIS)

    Hughes, L.D.; DeVol, T.A.

    2008-01-01

    Statistical control charts are presented for the evaluation of time-series radiation counter data from flow cells used for monitoring low levels of ⁹⁹TcO₄⁻ in environmental solutions. Control chart methods consisted of the 3-sigma (3σ) chart, the cumulative sum (CUSUM) chart, and the exponentially weighted moving average (EWMA) chart. Each method involves a control limit based on the detector background, which constitutes the detection limit. Both the CUSUM and EWMA charts are suitable for detecting and estimating sample concentration while requiring less solution volume than a 3σ control chart. Data presented here indicate that the overall accuracy and precision of the CUSUM method is the best. (author)
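
    A minimal sketch of the two recommended charts applied to counting data; the background rate, reference value k, decision interval h, and EWMA weight are illustrative, not the tuned values from the study.

        import numpy as np

        def cusum(x, target, k, h):
            """One-sided upper CUSUM: flag when cumulative drift above
            target + k exceeds the decision interval h."""
            s, alarms = 0.0, []
            for i, xi in enumerate(x):
                s = max(0.0, s + xi - target - k)
                if s > h:
                    alarms.append(i)
            return alarms

        def ewma(x, target, lam=0.2):
            """Exponentially weighted moving average of the counts."""
            z = [target]
            for xi in x:
                z.append(lam * xi + (1 - lam) * z[-1])
            return np.array(z[1:])

        rng = np.random.default_rng(7)
        bkg = 5.0                                   # mean background counts/interval
        counts = rng.poisson(bkg, 120).astype(float)
        counts[80:] += 3                            # low-level activity arriving

        print("CUSUM alarms at:", cusum(counts, target=bkg, k=1.0, h=5.0)[:3])
        print("final EWMA:", ewma(counts, target=bkg)[-1])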

  7. Mapping indoor radon-222 in Denmark: Design and test of the statistical model used in the second nationwide survey

    DEFF Research Database (Denmark)

    Andersen, C.E.; Ulbak, K.; Damkjær, A.

    2001-01-01

    In Denmark, a new survey of indoor radon-222 has been carried out. One-year alpha track measurements (CR-39) have been made in 3019 single-family houses. There are from 3 to 23 house measurements in each of the 275 municipalities. Within each municipality, houses have been selected randomly. One important outcome of the survey is the prediction of the fraction of houses in each municipality with an annual average radon concentration above 200 Bq m⁻³. To obtain the most accurate estimate and to assess the associated uncertainties, a statistical model has been developed. The purpose of this paper...
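
    The paper's statistical model is more elaborate than this, but as a much-simplified illustration of the quantity being predicted, the fraction of houses above 200 Bq m⁻³ can be estimated from a handful of measurements under a lognormal assumption:

        import numpy as np
        from scipy.stats import norm

        def fraction_above(concentrations, threshold=200.0):
            """Fraction of houses above a threshold, assuming indoor radon is
            roughly lognormal (a simplifying assumption made here)."""
            logs = np.log(concentrations)
            mu, sigma = logs.mean(), logs.std(ddof=1)
            return 1 - norm.cdf((np.log(threshold) - mu) / sigma)

        rng = np.random.default_rng(11)
        # simulated measurements for one municipality (illustrative parameters)
        measurements = rng.lognormal(mean=np.log(60), sigma=0.8, size=15)
        print(f"estimated fraction above 200 Bq/m3: {fraction_above(measurements):.2%}")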

  8. Experience in statistical quality control for road construction in South Africa

    CSIR Research Space (South Africa)

    Mitchell, MF

    1977-06-01

    Full Text Available The application of statistically oriented acceptance control procedures to a major road construction project is examined, and it is concluded that such procedures promise to be of benefit to both the client and the contractor.

  9. Development of remote controller for an EMI test receiver in site survey

    International Nuclear Information System (INIS)

    Cha, K. H.; Hwang, I. G.; Lee, D. Y.; Lee, K. Y.; Park, J. K.

    2000-01-01

    EMI assessment, which is based on a site survey (the measurement of EMI noise) in an operating plant, can be considered for system design. Our site survey is planned to use the ESI7 EMI test receiver manufactured by Rohde & Schwarz GmbH. However, the ESI7 must often be operated manually if a site survey continues for several days in a nuclear power plant. This problem can be resolved by implementing a remote controller for the ESI7. The Remote Controller supports the manual tasks of the ESI7, including storing mass SCAN data in external PC memory (hard disk), controlling the ESI7, and analyzing the stored SCAN data. These functions have been implemented in LabVIEW's 'G' programming language on a notebook PC with a PCMCIA-GPIB card. After integrated tests and their evaluation, the Remote Controller prototype will be applied to store the real EMI measurements in the coming site survey and to analyze the data

  10. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Directory of Open Access Journals (Sweden)

    Stephan Birle

    2016-01-01

    Full Text Available In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define still-acceptable control performance may differ considerably, a data-driven design method is performed. Based upon a historical data pool, statistical process corridors are derived for the controller inputs, control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that adjusts the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the...
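
    A toy version of the fuzzy feedback step: triangular membership functions and a two-rule base mapping (control error, change in control error) to a temperature correction. The set shapes, rules and units are illustrative stand-ins for the genetically tuned sets of the paper.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_temp_correction(error, d_error):
            """Tiny two-input fuzzy controller: error and change in error
            -> temperature correction (weighted-average defuzzification)."""
            neg_e, pos_e = tri(error, -2, -1, 0), tri(error, 0, 1, 2)
            neg_de, pos_de = tri(d_error, -1, -0.5, 0), tri(d_error, 0, 0.5, 1)
            rules = [
                (min(pos_e, pos_de), +1.0),   # behind trajectory and falling further: heat
                (min(neg_e, neg_de), -1.0),   # ahead of trajectory and accelerating: cool
            ]
            num = sum(w * out for w, out in rules)
            den = sum(w for w, _ in rules)
            return num / den if den else 0.0

        print(fuzzy_temp_correction(error=0.8, d_error=0.3))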

  11. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  12. Studies in Theoretical and Applied Statistics

    CERN Document Server

    Pratesi, Monica; Ruiz-Gazen, Anne

    2018-01-01

    This book includes a wide selection of the papers presented at the 48th Scientific Meeting of the Italian Statistical Society (SIS2016), held in Salerno on 8-10 June 2016. Covering a wide variety of topics ranging from modern data sources and survey design issues to measuring sustainable development, it provides a comprehensive overview of the current Italian scientific research in the fields of open data and big data in public administration and official statistics, survey sampling, ordinal and symbolic data, statistical models and methods for network data, time series forecasting, spatial analysis, environmental statistics, economic and financial data analysis, statistics in the education system, and sustainable development. Intended for researchers interested in theoretical and empirical issues, this volume provides interesting starting points for further research.

  13. Potentiation of cigarette smoking and radiation: evidence from a sputum cytology survey among uranium miners and controls

    International Nuclear Information System (INIS)

    Band, P.; Feldstein, M.; Saccomanno, G.; Watson, L.; King, G.

    1980-01-01

    To assess the effect of cigarette smoking and of exposure to radon daughters, a prospective survey consisting of periodic sputum cytology evaluation was initiated among 249 underground uranium miners and 123 male controls. Sputum cytology specimens showing moderate atypia, marked atypia, or cancer cells were classified as abnormal. As compared to control smokers, miners who smoke had a significantly higher incidence of abnormal cytology (P = 0.025). For miner smokers, the observed frequencies of abnormal cytology were linearly related to cumulative exposure to radon daughters and to the number of years of uranium mining. A statistical model relating the probability of abnormal cytology to the risk factors was investigated using a binary logistic regression. The estimated frequency of abnormal cytology was significantly dependent, for controls, on the duration of cigarette smoking, and for miners, on the duration of cigarette smoking and of uranium mining

  14. Use Of R in Statistics Lithuania

    Directory of Open Access Journals (Sweden)

    Tomas Rudys

    2016-06-01

    Full Text Available Recently, R has become more and more popular among official statistics offices. It can be used not only for research purposes but also for the production of official statistics. Statistics Lithuania recently started an analysis of the possibilities where R can be used and whether it could replace other statistical programming languages or systems. For this reason a working group was set up. In the paper we present an overview of the current situation regarding the implementation of R at Statistics Lithuania, some problems we are dealing with, and some future plans. At present, R is used mainly for research purposes. Looking forward, a short course on basic R was prepared, and at the moment we are starting to use R for data analysis, data manipulation from Oracle databases, preparation of some reports, data editing, and survey estimation. On the other hand, we have encountered some problems when working with big data sets, and also with survey sampling, as there are surveys with complex sampling designs. We are also analysing the running of R on our servers in order to be able to use more random access memory (RAM). Despite the problems, we are trying to use R in more fields in the production of official statistics.

  15. Disclosure Control using Partially Synthetic Data for Large-Scale Health Surveys, with Applications to CanCORS

    OpenAIRE

    Loong, Bronwyn; Zaslavsky, Alan M.; He, Yulei; Harrington, David P.

    2013-01-01

    Statistical agencies have begun to partially synthesize public-use data for major surveys to protect the confidentiality of respondents’ identities and sensitive attributes, by replacing high disclosure risk and sensitive variables with multiple imputations. To date, there are few applications of synthetic data techniques to large-scale healthcare survey data. Here, we describe partial synthesis of survey data collected by CanCORS, a comprehensive observational study of the experiences, treat...

  16. Quality control statistic for laboratory analysis and assays in Departamento de Tecnologia de Combustiveis - IPEN-BR

    International Nuclear Information System (INIS)

    Lima, Waldir C. de; Lainetti, Paulo E.O.; Lima, Roberto M. de; Peres, Henrique G.

    1996-01-01

    The purpose of this work is to study the introduction of statistical control for the tests and analyses carried out in the Departamento de Tecnologia de Combustiveis. The following are succinctly introduced: the theory of statistical process control, the elaboration of control charts, the definition of standard tests (or analyses), and how the standards are employed to determine the control limits in the charts. The most significant result is the form applied for practical quality control; the use of a verification standard and analysis in the control laboratory is also exemplified. (author)

  17. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    Science.gov (United States)

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  18. Statistics Graduate Teaching Assistants' Beliefs, Practices and Preparation for Teaching Introductory Statistics

    Science.gov (United States)

    Justice, Nicola; Zieffler, Andrew; Garfield, Joan

    2017-01-01

    Graduate teaching assistants (GTAs) are responsible for the instruction of many statistics courses offered at the university level, yet little is known about these students' preparation for teaching, their beliefs about how introductory statistics should be taught, or the pedagogical practices of the courses they teach. An online survey to examine…

  19. Factors controlling volume errors through 2D gully erosion assessment: guidelines for optimal survey design

    Science.gov (United States)

    Castillo, Carlos; Pérez, Rafael

    2017-04-01

    The assessment of gully erosion volumes is essential for the quantification of soil losses derived from this relevant degradation process. Traditionally, 2D and 3D approaches have been applied for this purpose (Casalí et al., 2006). Although innovative 3D approaches have recently been proposed for gully volume quantification, a renewed interest can be found in the literature regarding the useful information that cross-section analysis still provides in gully erosion research. Moreover, methods based on 2D approaches can be the most cost-effective in many situations, such as preliminary studies with low accuracy requirements or surveys under time or budget constraints. The main aim of this work is to examine the key factors controlling volume error variability in 2D gully assessment by means of a stochastic experiment involving a Monte Carlo analysis over synthetic gully profiles, in order to 1) contribute to a better understanding of the drivers and magnitude of uncertainty in 2D gully erosion surveys and 2) provide guidelines for optimal survey designs. Owing to the stochastic properties of error generation in 2D volume assessment, a statistical approach was followed to generate a large and significant set of gully reach configurations with which to evaluate quantitatively the influence of the main factors controlling the uncertainty of the volume assessment. For this purpose, a simulation algorithm in Matlab® code was written, involving the following stages: generation of synthetic gully area profiles with different degrees of complexity (characterized by the cross-section variability); simulation of field measurements characterised by a survey intensity and the precision of the measurement method; and quantification of the volume error uncertainty as a function of the key factors. In this communication we present the relationships between volume error and the studied factors and propose guidelines for 2D field surveys based on the minimal survey...
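
    A condensed Python analogue of the simulation stages described above, assuming arbitrary reach geometry and noise parameters; it generates synthetic area profiles, "surveys" them with a limited number of cross-sections, and summarises the resulting volume error distribution:

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate_reach(length=100.0, step=0.1, roughness=0.3):
            """Synthetic gully: cross-sectional area (m^2) along the reach,
            built as a smooth base plus smoothed (autocorrelated) variability."""
            x = np.arange(0, length, step)
            base = 2.0 + 0.5 * np.sin(2 * np.pi * x / length)
            noise = np.convolve(rng.normal(0, roughness, x.size), np.ones(25) / 25, "same")
            return x, np.clip(base + noise, 0.1, None)

        def surveyed_volume(x, area, n_sections):
            """Volume from n equally spaced cross-sections (trapezoidal rule)."""
            idx = np.linspace(0, x.size - 1, n_sections).astype(int)
            return np.trapz(area[idx], x[idx])

        errors = []
        for _ in range(500):                       # Monte Carlo over synthetic reaches
            x, area = simulate_reach()
            true_v = np.trapz(area, x)             # "true" volume from the fine grid
            est_v = surveyed_volume(x, area, n_sections=10)
            errors.append(100 * (est_v - true_v) / true_v)

        print(f"volume error: mean {np.mean(errors):.1f}%, sd {np.std(errors):.1f}%")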

  20. 76 FR 71309 - Notice of Intent To Suspend the Distillers Co-Products Survey and All Associated Reports

    Science.gov (United States)

    2011-11-17

    ... Distillers Co-Products Survey and All Associated Reports AGENCY: National Agricultural Statistics Service... Distillers Co- Products survey currently approved under docket 0535-0247. FOR FURTHER INFORMATION CONTACT... . SUPPLEMENTARY INFORMATION: Title: Suspension of Distillers Co-Products Survey. OMB Control Number: 0535-0247...

  1. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product

  2. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  3. Monitoring a PVC batch process with multivariate statistical process control charts

    NARCIS (Netherlands)

    Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.

    1999-01-01

    Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these

  4. Transition to academic nurse educator: a survey exploring readiness, confidence, and locus of control.

    Science.gov (United States)

    Goodrich, Robin S

    2014-01-01

    The purpose of this study was to describe nurse transition to the role of academic nurse educator and to investigate the resources and barriers that nurses experience during this career transition, specifically the relationships among levels of readiness, confidence, personal control, support, decision independence, general self-esteem, and work locus of control. A convenience sample of registered nurses in the United States (N = 541) who hold current full-time employment at an accredited nursing program granting baccalaureate or higher degrees was utilized. Subjects were recruited via electronic mail and answered an on-line survey. Pearson product-moment correlation and multivariate analysis of variance were used for statistical calculations. Results indicated significant, positive relationships among all the variables except readiness and personal control (p = .01). Significant differences were found in amount of time that nurses were in the role of academic nurse educator and the demographic variables of number of children, marital status, and highest degree held. The results of this study provide evidence to support and enhance processes to develop and retain nurse academicians, to promote excellence in the academic nurse educator role, and to advance the science and practice of the profession. © 2014.

  5. The application of statistical process control in linac quality assurance

    International Nuclear Information System (INIS)

    Li Dingyu; Dai Jianrong

    2009-01-01

    Objective: To improve the linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC is applied to set the control limits of QA data, draw charts, and differentiate between random and systematic errors. An SPC quality assurance software package named QA MANAGER has been developed in VB programming for clinical use. Two clinical cases are analyzed with SPC to study the daily output QA of a 6 MV photon beam. Results: In the clinical cases, SPC is able to identify the systematic errors. Conclusion: The SPC application may assist in detecting systematic errors in linac quality assurance, alarming on abnormal trends so that systematic errors can be eliminated and quality control improved. (authors)

  6. Relationship between parental locus of control and caries experience in preschool children - cross-sectional survey.

    Science.gov (United States)

    Lencová, Erika; Pikhart, Hynek; Broukal, Zdenek; Tsakos, Georgios

    2008-06-12

    Due to high prevalence and serious impacts, childhood caries represents a public health issue. Behavioural risk factors such as locus of health control have been implicated in the development of the disease; however, their association with childhood caries has not been thoroughly studied. The aim of this cross-sectional survey was to assess the relationship between parental locus of health control and the caries experience and untreated caries of preschool children in a representative sample in the Czech Republic, adjusting for relevant sociodemographic characteristics. A representative sample of 285 preschool children and their parents was recruited. Study data included children's dental status recorded in nurseries and parental questionnaires with 13 attitudinal items regarding locus of control (LoC) in caries prevention. The association between parental locus of control and children's caries experience and level of untreated caries was analysed using logistic regression, adjusting for the effect of key sociodemographic variables. There was a statistically highly significant linear trend between increased parental LoC and a higher probability of the children being free from untreated caries, independent of the effect of sociodemographic variables of children and parents. A similar highly statistically significant trend, although not entirely linear and likewise independent of sociodemographic variables, was observed with respect to the chance of the children being free from caries experience with increasing strength of parental LoC. After full adjustment, children in the strongest parental LoC quintile were 2.81 (1.23-6.42) times more likely to be free of untreated caries. The results support the relevance of parental locus of health control to both untreated caries and caries experience in preschool children and highlight that a more internal LoC within the family is advantageous in the prevention of dental caries.

  7. Are You in a Survey?

    Science.gov (United States)

  8. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Kristensen, Kasper; Lewy, Peter

    2014-01-01

    Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes...

  9. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international over-views are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.(eb)

  11. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...

  12. Statistical problems raised by data processing of food surveys

    International Nuclear Information System (INIS)

    Lacourly, Nancy

    1974-01-01

    The methods used for the analysis of dietary habits of national populations - food surveys - have been studied. S. Lederman's linear model for the estimation of average individual consumptions from total family diets was examined in the light of a food survey carried out with 250 Roman families in 1969. An important bias in the estimates thus obtained was shown by a simulation assuming a 'housewife's dictatorship'; these assumptions should contribute to setting up an unbiased model. Several techniques of multidimensional analysis were therefore used, and the theoretical aspects of linear regression for some particular situations had to be investigated: quasi-collinear 'independent variables', measurements with errors, and positive constraints on regression coefficients. A new survey methodology was developed taking account of the new 'Integrated Information Systems', which affect all the stages of a consumption survey: organization, data collection, constitution of an information bank and data processing. (author) [fr

  13. The Research Potential of New Types of Enterprise Data based on Surveys from Official Statistics in Germany

    OpenAIRE

    Joachim Wagner

    2010-01-01

    A new generation of data sets became available recently in the research data centres of the German statistical offices. These new data combine information for firms gathered in different surveys (or from other sources) that could not be analyzed jointly before. This paper offers a short description of these data, and gives examples of their use to demonstrate their research potential. Furthermore, and looking ahead to the next generation of data, it discusses an ongoing project, KombiFiD, tha...

  14. Cancer Statistics

    Science.gov (United States)

    ... Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  15. Nonclinical statistics for pharmaceutical and biotechnology industries

    CERN Document Server

    2016-01-01

    This book serves as a reference text for regulatory, industry and academic statisticians and also a handy manual for entry-level statisticians. Additionally it aims to stimulate academic interest in the field of nonclinical statistics and promote this as an important discipline in its own right. This text brings together for the first time in a single volume a comprehensive survey of methods important to the nonclinical science areas within the pharmaceutical and biotechnology industries, specifically the Discovery and Translational sciences, the Safety/Toxicology sciences, and the Chemistry, Manufacturing and Controls sciences. Drug discovery and development is a long and costly process. Most decisions in the drug development process are made with incomplete information. The data are rife with uncertainties and hence risky by nature. This is therefore the purview of statistics. As such, this book aims to introduce readers to important statistical thinking and its application in these nonclinical areas. The cha...

  16. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  17. The use of test scores from large-scale assessment surveys: psychometric and statistical considerations

    Directory of Open Access Journals (Sweden)

    Henry Braun

    2017-11-01

    Abstract Background Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT or ACT. These differences have important implications both for utilization and interpretation. Although much has been written about PVs, it appears that there are still misconceptions about whether and how to employ them in secondary analyses. Methods We address a range of technical issues, including those raised in a recent article that was written to inform economists using these databases. First, an extensive review of the relevant literature was conducted, with particular attention to key publications that describe the derivation and psychometric characteristics of such achievement measures. Second, a simulation study was carried out to compare the statistical properties of estimates based on the use of PVs with those based on other, commonly used methods. Results It is shown, through both theoretical analysis and simulation, that under fairly general conditions the appropriate use of PVs yields approximately unbiased estimates of model parameters in regression analyses of large-scale survey data. The superiority of the PV methodology is particularly evident when measures of student achievement are employed as explanatory variables. Conclusions The PV methodology used to report student test performance in large-scale surveys remains the state of the art for secondary analyses of these databases.
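
As context for the PV methodology discussed above: in a typical secondary analysis, the model is fitted once per plausible value and the results are pooled with Rubin's combining rules. A minimal sketch, assuming the M per-PV point estimates and sampling variances are already computed; all numbers are illustrative, not taken from the study.

```python
import numpy as np

def combine_plausible_values(estimates, variances):
    """Combine per-plausible-value regression estimates with Rubin's rules.

    estimates : the M point estimates (one regression per plausible value)
    variances : the M corresponding sampling variances
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    qbar = estimates.mean()                 # pooled point estimate
    w = variances.mean()                    # within-imputation variance
    b = estimates.var(ddof=1)               # between-imputation variance
    t = w + (1.0 + 1.0 / m) * b             # total variance
    return qbar, np.sqrt(t)

# Example with M = 5 hypothetical coefficient estimates:
beta, se = combine_plausible_values(
    [0.42, 0.45, 0.40, 0.44, 0.43],
    [0.010, 0.011, 0.009, 0.010, 0.012],
)
print(f"pooled beta = {beta:.3f}, total SE = {se:.3f}")
```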

  18. The statistical analysis of anisotropies

    International Nuclear Information System (INIS)

    Webster, A.

    1977-01-01

    One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere, to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes, and since they are not all equally good this contribution is presented as a brief guide to what seem to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)

  19. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Science.gov (United States)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  20. Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance

    Science.gov (United States)

    Whitley, Cameron T.; Dietz, Thomas

    2018-01-01

    Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…

  1. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    Science.gov (United States)

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
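
To make the common-cause/special-cause distinction concrete, here is a minimal individuals (XmR) chart sketch, one of the simplest control charts used in healthcare QI. The 2.66 multiplier is the standard XmR scaling constant; the data are hypothetical and not from the review.

```python
import numpy as np

def xmr_limits(x):
    """Individuals (XmR) chart limits for flagging special cause variation.

    Center line is the mean; limits use the average moving range
    scaled by the standard XmR constant 2.66 (approximately 3 sigma).
    """
    x = np.asarray(x, dtype=float)
    center = x.mean()
    mr_bar = np.abs(np.diff(x)).mean()      # average moving range
    ucl = center + 2.66 * mr_bar
    lcl = center - 2.66 * mr_bar
    special = (x > ucl) | (x < lcl)         # points beyond the limits
    return center, lcl, ucl, special

# Hypothetical monthly values of a neonatal process measure:
rates = [12.1, 11.8, 12.5, 12.0, 11.6, 12.2, 15.9, 12.3]
center, lcl, ucl, special = xmr_limits(rates)
print(f"CL={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, flagged={list(special)}")
```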

  2. Topics in theoretical and applied statistics

    CERN Document Server

    Giommi, Andrea

    2016-01-01

    This book highlights the latest research findings from the 46th International Meeting of the Italian Statistical Society (SIS) in Rome, during which both methodological and applied statistical research was discussed. This selection of fully peer-reviewed papers, originally presented at the meeting, addresses a broad range of topics, including the theory of statistical inference; data mining and multivariate statistical analysis; survey methodologies; analysis of social, demographic and health data; and economic statistics and econometrics.

  3. EFFECT OF MEASUREMENT ERRORS ON PREDICTED COSMOLOGICAL CONSTRAINTS FROM SHEAR PEAK STATISTICS WITH LARGE SYNOPTIC SURVEY TELESCOPE

    Energy Technology Data Exchange (ETDEWEB)

    Bard, D.; Chang, C.; Kahn, S. M.; Gilmore, K.; Marshall, S. [KIPAC, Stanford University, 452 Lomita Mall, Stanford, CA 94309 (United States); Kratochvil, J. M.; Huffenberger, K. M. [Department of Physics, University of Miami, Coral Gables, FL 33124 (United States); May, M. [Physics Department, Brookhaven National Laboratory, Upton, NY 11973 (United States); AlSayyad, Y.; Connolly, A.; Gibson, R. R.; Jones, L.; Krughoff, S. [Department of Astronomy, University of Washington, Seattle, WA 98195 (United States); Ahmad, Z.; Bankert, J.; Grace, E.; Hannel, M.; Lorenz, S. [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Haiman, Z.; Jernigan, J. G., E-mail: djbard@slac.stanford.edu [Department of Astronomy and Astrophysics, Columbia University, New York, NY 10027 (United States); and others

    2013-09-01

    We study the effect of galaxy shape measurement errors on predicted cosmological constraints from the statistics of shear peak counts with the Large Synoptic Survey Telescope (LSST). We use the LSST Image Simulator in combination with cosmological N-body simulations to model realistic shear maps for different cosmological models. We include both galaxy shape noise and, for the first time, measurement errors on galaxy shapes. We find that the measurement errors considered have relatively little impact on the constraining power of shear peak counts for LSST.

  4. Statistical imprints of CMB B -type polarization leakage in an incomplete sky survey analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Larissa; Wang, Kai; Hu, Yangrui; Fang, Wenjuan; Zhao, Wen, E-mail: larissa@ustc.edu.cn, E-mail: ljwk@mail.ustc.edu.cn, E-mail: hyr1996@mail.ustc.edu.cn, E-mail: wenjuan.fang@gmail.com, E-mail: wzhao7@ustc.edu.cn [CAS Key Laboratory for Researches in Galaxies and Cosmology, Department of Astronomy, University of Science and Technology of China, Chinese Academy of Sciences, Hefei, Anhui 230026 (China)

    2017-01-01

    One of the main goals of modern cosmology is to search for primordial gravitational waves by looking for their imprints in the B-type polarization of the cosmic microwave background radiation. However, this signal is contaminated by various sources, including cosmic weak lensing, foreground radiation, instrumental noise, as well as the E-to-B leakage caused by partial sky surveys, all of which should be well understood to avoid misinterpretation of the observed data. In this paper, we adopt the E/B decomposition method suggested by Smith in 2006, and study the imprints of E-to-B leakage residuals in the constructed B-type polarization maps, B(n̂), by employing various statistical tools. We find that the effects of E-to-B leakage are negligible for the B-mode power spectrum, as well as for the skewness and kurtosis analyses of B-maps. However, if we employ morphological statistical tools, including Minkowski functionals and/or Betti numbers, the effect of the leakage can be detected at a very high confidence level, which shows that in morphological analyses the leakage can play a significant role as a contaminant for measuring the primordial B-mode signal and must be taken into account for a correct explanation of the data.

  5. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    Science.gov (United States)

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of a general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for the complex multistage sample design: (1) the leave-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of the two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
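
As a rough illustration of variance-estimation option (1), the sketch below implements a generic delete-one jackknife. Note that this is the simple version: a production implementation for a complex multistage design would jackknife whole sampling units or use replicate weights. The estimator and the data are invented for the example.

```python
import numpy as np

def jackknife_variance(data, estimator):
    """Leave-one-out jackknife variance of a generic estimator."""
    data = np.asarray(data, dtype=float)
    n = len(data)
    replicates = np.array(
        [estimator(np.delete(data, i)) for i in range(n)]
    )
    theta_bar = replicates.mean()
    # Standard delete-1 jackknife variance formula
    return (n - 1) / n * np.sum((replicates - theta_bar) ** 2)

# Example: variance of a simple hazard-rate-like estimator (events / person-time)
person_time = np.array([1.0, 2.5, 0.5, 3.0, 1.5, 2.0])
var = jackknife_variance(person_time, lambda pt: len(pt) / pt.sum())
print(f"jackknife variance = {var:.4f}")
```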

  6. Control room habitability survey of licensed commercial nuclear power generating stations

    International Nuclear Information System (INIS)

    Driscoll, J.W.

    1988-10-01

    This document presents the results of a survey of control room habitability systems at twelve commercial nuclear generating stations. The survey, conducted by Argonne National Laboratory (ANL), is part of an NRC program initiated in response to concerns and recommendations of the Advisory Committee on Reactor Safeguards (ACRS). The major conclusion of the report is that the numerous types of potentially significant discrepancies found among the surveyed plants may be indicative of similar discrepancies throughout the industry. The report provides plant-specific and generalized findings regarding safety functions, with respect to the consistency of the design, construction, operation and testing of control room habitability systems and the corresponding Technical Specifications, compared with the descriptions provided in the license basis documentation, including the assumptions in the operator toxic gas concentration and radiation dose calculations. Calculations of operator toxic gas concentrations and radiation doses were provided in the license basis documentation and were not performed by the ANL survey team. Recommendations for improvement are provided in the report.

  7. Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists

    Directory of Open Access Journals (Sweden)

    Laura Badenes-Ribera

    2018-06-01

    Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge of effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to misinterpretation of results. It also found that, although the use of ES is becoming generalized, the same is not true for CIs. Finally, academics with greater knowledge of ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an online survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women, mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequently mentioned ES statistics were Cohen's d and R2/η2, which can be affected by outliers, non-normality, or violations of statistical assumptions. In addition, academics showed poor knowledge of meta-analytic displays (e.g., forest plots and funnel plots) and quality checklists for studies. Finally, academics with higher-level knowledge of ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not

  8. Statistical transformation and the interpretation of inpatient glucose control data from the intensive care unit.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-05-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on the interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit during 2011 were obtained. Box-Cox transformation of the POC-BG measurements was performed, and the distribution of the data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in the projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
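
A compact sketch of the two steps described above: a Box-Cox transformation toward normality, followed by an EWMA chart with exact time-varying limits. It relies on scipy.stats.boxcox; the weight of 0.2 and width L = 3 are conventional EWMA choices rather than the paper's settings, and the glucose values are synthetic.

```python
import numpy as np
from scipy import stats

# Hypothetical POC blood glucose values (mg/dL); real ICU data are skewed.
glucose = np.array([105, 118, 96, 210, 134, 99, 156, 122, 188, 143,
                    101, 176, 129, 95, 240, 138, 117, 162, 108, 125], float)

# Box-Cox transformation pulls the skewed distribution toward normality.
transformed, lam = stats.boxcox(glucose)
print(f"estimated Box-Cox lambda = {lam:.2f}")

def ewma_chart(x, weight=0.2, L=3.0):
    """EWMA statistic with exact time-varying control limits."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = np.empty_like(x)
    z[0] = weight * x[0] + (1 - weight) * mu
    for i in range(1, len(x)):
        z[i] = weight * x[i] + (1 - weight) * z[i - 1]
    i = np.arange(1, len(x) + 1)
    half_width = L * sigma * np.sqrt(
        weight / (2 - weight) * (1 - (1 - weight) ** (2 * i))
    )
    return z, mu - half_width, mu + half_width

z, lcl, ucl = ewma_chart(transformed)
print("out-of-control samples:", np.where((z < lcl) | (z > ucl))[0])
```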

  9. Weak lensing in the Dark Energy Survey

    Science.gov (United States)

    Troxel, Michael

    2016-03-01

    I will present the current status of weak lensing results from the Dark Energy Survey (DES). DES will survey 5000 square degrees in five photometric bands (grizY), and has already provided a competitive weak lensing catalog from Science Verification data covering just 3% of the final survey footprint. I will summarize the status of shear catalog production using observations from the first year of the survey and discuss recent weak lensing science results from DES. Finally, I will report on the outlook for future cosmological analyses in DES including the two-point cosmic shear correlation function and discuss challenges that DES and future surveys will face in achieving a control of systematics that allows us to take full advantage of the available statistical power of our shear catalogs.

  10. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate, and less radioactive water-soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means which guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs

  11. 75 FR 35093 - Submission for Review: Customer Satisfaction Surveys, OMB Control No. 3206-0236.

    Science.gov (United States)

    2010-06-21

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Customer Satisfaction Surveys, OMB Control... customers to evaluate our performance in providing services. Customer satisfaction surveys are valuable... surveys. Only those surveys relating specifically to customer satisfaction will be associated with OMB...

  12. Quality Control of the Print with the Application of Statistical Methods

    Science.gov (United States)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

    The basis for standardizing the offset printing process is the control of print quality indicators. This problem can be approached in various ways, among which statistical methods are the most important. Their practical implementation for managing the quality of the printing process is highly relevant and is the subject of this paper. The possibility of using control charts to identify the causes of deviations in optical density for a triad of inks in offset printing is demonstrated.

  13. The German Birth Order Register - order-specific data generated from perinatal statistics and statistics on out-of-hospital births 2001-2008

    OpenAIRE

    Michaela Kreyenfeld; Rembrandt D. Scholz; Frederik Peters; Ines Wlosnewski

    2010-01-01

    Until 2008, Germany’s vital statistics did not include information on the biological order of each birth. This resulted in a dearth of important demographic indicators, such as the mean age at first birth and the level of childlessness. Researchers have tried to fill this gap by generating order-specific birth rates from survey data, and by combining survey data with vital statistics. This paper takes a different approach by using hospital statistics on births to generate birth order-specific...

  14. Assessing thermal comfort and energy efficiency in buildings by statistical quality control for autocorrelated data

    International Nuclear Information System (INIS)

    Barbeito, Inés; Zaragoza, Sonia; Tarrío-Saavedra, Javier; Naya, Salvador

    2017-01-01

    Highlights: • Intelligent web platform development for energy efficiency management in buildings. • Controlling and supervising thermal comfort and energy consumption in buildings. • Statistical quality control procedure to deal with autocorrelated data. • Open source alternative using R software. - Abstract: In this paper, a case study is presented of a reliable statistical procedure for evaluating the quality of HVAC systems in buildings, using data retrieved from an ad hoc big data web energy platform. The proposed methodology, based on statistical quality control (SQC), is used to analyze the real state of thermal comfort and energy efficiency of the offices of the company FRIDAMA (Spain) in a reliable way. Non-conformities or alarms, and the actual assignable causes of these out-of-control states, are detected. The capability to meet specification requirements is also analyzed. Tools and packages implemented in the open-source R software are employed to apply the different procedures. First, this study proposes to fit ARIMA time series models to CTQ variables. Then, Shewhart and EWMA control charts are applied to the time-series residuals to control and monitor thermal comfort and energy consumption in buildings. Once thermal comfort and consumption variability are estimated, capability indexes for autocorrelated variables are implemented to calculate the degree to which standards specifications are met. According to the case study results, the proposed methodology detected real anomalies in the HVAC installation, helping to identify assignable causes and to make appropriate decisions. One of the goals is to describe this statistical procedure step by step so that practitioners can replicate it.
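
The core of the proposed methodology - fit a time-series model to the autocorrelated CTQ variable, then chart the residuals - can be sketched briefly. The paper works in R; for consistency with the other sketches here, this is a Python analogue using statsmodels, with a simulated AR(1) series standing in for the real CTQ data and plain 3-sigma Shewhart limits on the residuals.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)

# Simulate an autocorrelated CTQ variable (e.g., office temperature, deg C).
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.7 * x[t - 1] + rng.normal(0.0, 0.2)
temp = 21.0 + x

# Step 1: fit a time-series model so the residuals are close to independent.
result = ARIMA(temp, order=(1, 0, 0)).fit()
resid = result.resid

# Step 2: apply Shewhart-type 3-sigma limits to the residuals.
mu, sigma = resid.mean(), resid.std(ddof=1)
flagged = np.where(np.abs(resid - mu) > 3 * sigma)[0]
print("samples out of control:", flagged)
```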

  15. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    Science.gov (United States)

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe statistical process control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, in which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, so the centre line showed corresponding inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  16. Statistical cluster analysis and diagnosis of nuclear system level performance

    International Nuclear Information System (INIS)

    Teichmann, T.; Levine, M.M.; Samanta, P.K.; Kato, W.Y.

    1985-01-01

    The complexity of individual nuclear power plants and the importance of maintaining reliable and safe operations make it desirable to complement the deterministic analyses of these plants with corresponding statistical surveys and diagnoses. Based on such investigations, one can then explore, statistically, the anticipation, prevention, and, when necessary, the control of failures and malfunctions. This paper, and the accompanying one by Samanta et al., describe some of the initial steps in exploring the feasibility of setting up such a program on an integrated and global (industry-wide) basis. The conceptual statistical and data framework was originally outlined in BNL/NUREG-51609, NUREG/CR-3026, and the present work aims at showing how some important elements might be implemented in a practical way (albeit using hypothetical or simulated data)

  17. Statistical deception at work

    CERN Document Server

    Mauro, John

    2013-01-01

    Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g

  18. Multivariate Statistical Process Control Charts and the Problem of Interpretation: A Short Overview and Some Applications in Industry

    OpenAIRE

    Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios

    2005-01-01

    Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality/process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...
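
The workhorse statistic behind much of MSPC is Hotelling's T², which monitors several correlated characteristics jointly instead of running separate univariate charts. A minimal Phase I sketch, using the chi-square approximation to the control limit (reasonable for a large in-control sample); the data and the injected shift are synthetic.

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, alpha=0.0027):
    """Hotelling T^2 statistics for joint multivariate process monitoring.

    Uses the chi-square approximation for the upper control limit,
    which is adequate when the in-control sample is large.
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mean = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mean
    t2 = np.einsum('ij,jk,ik->i', d, S_inv, d)   # per-sample T^2
    ucl = stats.chi2.ppf(1 - alpha, df=p)
    return t2, ucl

rng = np.random.default_rng(7)
X = rng.multivariate_normal([10, 5], [[1.0, 0.6], [0.6, 1.0]], size=200)
X[120] += [3.5, -3.0]                      # inject a special-cause shift
t2, ucl = hotelling_t2(X)
print("signals at samples:", np.where(t2 > ucl)[0])
```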

  19. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    Science.gov (United States)

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.

  20. Survey on the Finnish biotechnology industry: Background and descriptive statistics

    OpenAIRE

    Hermans, Raine; Kulvik, Martti; Tahvanainen, Antti-Jussi

    2005-01-01

    ETLA, the Research Institute of the Finnish Economy, conducted surveys at the end of 2004 and at the beginning of 2002 on the enterprises listed in the Index of Biotechnology Companies in the Finnish Bioindustries organization. The surveys provide data on financial accounting, R&D activities, intellectual property rights, and sales forecasts. In addition to the updates, the ETLA 2004 Survey also provides detailed linkages to product-level information that incorporates R&D- and sales figures, ...

  1. THE SLOAN DIGITAL SKY SURVEY QUASAR LENS SEARCH. IV. STATISTICAL LENS SAMPLE FROM THE FIFTH DATA RELEASE

    International Nuclear Information System (INIS)

    Inada, Naohisa; Oguri, Masamune; Shin, Min-Su; Kayo, Issha; Fukugita, Masataka; Strauss, Michael A.; Gott, J. Richard; Hennawi, Joseph F.; Morokuma, Tomoki; Becker, Robert H.; Gregg, Michael D.; White, Richard L.; Kochanek, Christopher S.; Chiu, Kuenley; Johnston, David E.; Clocchiatti, Alejandro; Richards, Gordon T.; Schneider, Donald P.; Frieman, Joshua A.

    2010-01-01

    We present the second report of our systematic search for strongly lensed quasars from the data of the Sloan Digital Sky Survey (SDSS). From extensive follow-up observations of 136 candidate objects, we find 36 lenses in the full sample of 77,429 spectroscopically confirmed quasars in the SDSS Data Release 5. We then define a complete sample of 19 lenses, including 11 from our previous search in the SDSS Data Release 3, from the sample of 36,287 quasars with i < 19.1, and from it obtain Ω_Λ = 0.84 +0.06/-0.08 (stat.) +0.09/-0.07 (syst.) assuming a flat universe, which is in good agreement with other cosmological observations. We also report the discoveries of seven binary quasars with separations ranging from 1.″1 to 16.″6, which were identified in the course of our lens survey. This study concludes the construction of our statistical lens sample in the full SDSS-I data set.

  2. 75 FR 65040 - Submission for Review: Customer Satisfaction Surveys, OMB Control No. 3206-0236

    Science.gov (United States)

    2010-10-21

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Customer Satisfaction Surveys, OMB Control.... Customer satisfaction surveys are valuable tools to gather information from our customers so we can design... specifically to customer satisfaction will be associated with OMB Control No. 3206-0236. We estimate 495,182...

  3. Advances in Statistical Control, Algebraic Systems Theory, and Dynamic Systems Characteristics A Tribute to Michael K Sain

    CERN Document Server

    Won, Chang-Hee; Michel, Anthony N

    2008-01-01

    This volume - dedicated to Michael K. Sain on the occasion of his seventieth birthday - is a collection of chapters covering recent advances in stochastic optimal control theory and algebraic systems theory. Written by experts in their respective fields, the chapters are thematically organized into four parts: Part I focuses on statistical control theory, where the cost function is viewed as a random variable and performance is shaped through cost cumulants. In this respect, statistical control generalizes linear-quadratic-Gaussian and H-infinity control. Part II addresses algebraic systems th

  4. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that

  5. A statistical study towards high-mass BGPS clumps with the MALT90 survey

    Science.gov (United States)

    Liu, Xiao-Lan; Xu, Jin-Long; Ning, Chang-Chun; Zhang, Chuan-Peng; Liu, Xiao-Tao

    2018-01-01

    In this work, we perform a statistical investigation of 50 high-mass clumps using data from the Bolocam Galactic Plane Survey (BGPS) and the Millimetre Astronomy Legacy Team 90-GHz survey (MALT90). Eleven dense molecular lines (N2H+(1–0), HNC(1–0), HCO+(1–0), HCN(1–0), HN13C(1–0), H13CO+(1–0), C2H(1–0), HC3N(10–9), SiO(2–1), 13CS(2–1) and HNCO(4(0,4)–3(0,3))) are detected. N2H+ and HNC are shown to be good tracers for clumps in various evolutionary stages, since they are detected in all the fields. The detection rates of N-bearing molecules decrease as the clumps evolve, but those of O-bearing species increase with evolution. Furthermore, the abundance ratios [N2H+]/[HCO+] and log([HC3N]/[HCO+]) decline with log([HCO+]) as two linear functions, respectively. This suggests that N2H+ and HC3N transform to HCO+ as the clumps evolve. We also find that C2H is the most abundant molecule, with an abundance of order 10^-8. In addition, three new infall candidates, G010.214–00.324, G011.121–00.128 and G012.215–00.118(a), are discovered to have large-scale infall motions, with infall rates of order 10^-3 M⊙ yr^-1.

  6. Cluster survey of the high-altitude cusp properties: a three-year statistical study

    Directory of Open Access Journals (Sweden)

    B. Lavraud

    2004-09-01

    The global characteristics of the high-altitude cusp and its surrounding regions are investigated using a three-year statistical survey based on data obtained by the Cluster spacecraft. The analysis involves an elaborate orbit-sampling methodology that uses a model field and takes into account the actual solar wind conditions and level of geomagnetic activity. The spatial distribution of the magnetic field and various plasma parameters in the vicinity of the low magnetic field exterior cusp are determined, and it is found that: (1) the magnetic field distribution shows the presence of an intermediate region between the magnetosheath and the magnetosphere: the exterior cusp; (2) this region is characterized by the presence of dense plasma of magnetosheath origin; a comparison with the Tsyganenko (1996) magnetic field model shows that it is diamagnetic in nature; (3) the spatial distributions show that three distinct boundaries with the lobes, the dayside plasma sheet and the magnetosheath surround the exterior cusp; (4) the external boundary with the magnetosheath has a sharp bulk velocity gradient, as well as a density decrease and temperature increase as one goes from the magnetosheath to the exterior cusp; (5) while the two inner boundaries form a funnel, the external boundary shows no clear indentation; (6) the plasma and magnetic pressure distributions suggest that the exterior cusp is in equilibrium with its surroundings in a statistical sense; and (7) a preliminary analysis of the bulk flow distributions suggests that the exterior cusp is stagnant under northward IMF conditions but convective under southward IMF conditions.

  7. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    Science.gov (United States)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics database is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  8. Surveying managers to inform a regionally relevant invasive Phragmites australis control research program.

    Science.gov (United States)

    Rohal, C B; Kettenring, K M; Sims, K; Hazelton, E L G; Ma, Z

    2018-01-15

    Managers of invasive species consider the peer-reviewed literature only moderately helpful for guiding their management programs. Though this "knowing-doing gap" has been well described, there have been few efforts to guide scientists in how to develop useful and usable science. Here we demonstrate how a comprehensive survey of managers (representing 42 wetland management units across the Great Salt Lake watershed) can highlight management practices and challenges (here for the widespread invasive plant Phragmites australis, a recent and aggressive invader in this region) to ultimately inform a research program. The diverse surveyed organizations had wide-ranging amounts of Phragmites, which led to different goals and approaches, including more aggressive control targets and a wider array of control tools for smaller, private organizations compared with larger government-run properties. We found that nearly all managers (97%) used herbicide as their primary Phragmites control tool, while burning (65%), livestock grazing (49%), and mowing (43%) were also frequently used. Managers expressed uncertainties regarding the timing of herbicide application and the type of herbicide for effective control. Trade-offs between different Phragmites treatments were driven by budgetary concerns, as well as environmental conditions like water levels and social constraints like permitting issues. Managers had specific ideas about the plant communities they desired following Phragmites control, yet revegetation with native species was rarely attempted. The results of this survey informed the development of large-scale, multi-year Phragmites control and native plant revegetation experiments to address management uncertainties regarding herbicide type and timing. The survey also facilitated initial scientist-manager communication, which led to collaborations and knowledge co-production between managers and researchers. An important outcome of the survey was that experimental results were

  9. Chlamydia control in Europe - a survey of Member States

    DEFF Research Database (Denmark)

    Andersen, Berit; van Bergen, J; Ward, H

    ), contributed to the design and interpretation of the survey, commented on the draft report and approved the final report. Shelagh Redmond (Institute of Social and Preventive Medicine, University of Bern, Bern, Switzerland) provided technical support. Nicola Low (University of Bern) led the Chlamydia Control...

  10. Quality control in public participation assessments of water quality: the OPAL Water Survey.

    Science.gov (United States)

    Rose, N L; Turner, S D; Goldsmith, B; Gosling, L; Davidson, T A

    2016-07-22

    Public participation in scientific data collection is a rapidly expanding field. In water quality surveys, the involvement of the public, usually as trained volunteers, generally includes the identification of aquatic invertebrates to a broad taxonomic level. However, quality assurance is often not addressed and remains a key concern for the acceptance of publicly generated water quality data. The Open Air Laboratories (OPAL) Water Survey, launched in May 2010, aimed to encourage interest and participation in water science by developing a 'low-barrier-to-entry' water quality survey. During 2010, over 3000 participant-selected lakes and ponds were surveyed, making this the largest public participation lake and pond survey undertaken to date in the UK. But the OPAL approach of using untrained volunteers and largely anonymous data submission exacerbates quality control concerns. A number of approaches were used to address data quality issues, including: sensitivity analysis to determine differences due to operator, sampling effort and duration; direct comparisons of identification between participants and experienced scientists; the use of a self-assessment identification quiz; the use of multiple participant surveys to assess data variability at single sites over short periods of time; and comparison of survey techniques with other measurement variables and with other metrics generally considered more accurate. These quality control approaches were then used to screen the OPAL Water Survey data to generate a more robust dataset. The OPAL Water Survey results provide a regional and national assessment of water quality as well as a first national picture of water clarity (as suspended solids concentrations). Less than 10% of the lakes and ponds surveyed were of 'poor' quality, while 26.8% were in the highest water quality band. It is likely that there will always be a question mark over untrained-volunteer-generated data simply because quality assurance is uncertain

  11. Federal Funds for Research and Development: Fiscal Years 1980, 1981, and 1982. Volume XXX. Detailed Statistical Tables. Surveys of Science Resources Series.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    During the March through July 1981 period a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived…

  12. Statistical sampling method for releasing decontaminated vehicles

    International Nuclear Information System (INIS)

    Lively, J.W.; Ware, J.A.

    1996-01-01

    Earth moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method (MIL-STD-105E, "Sampling Procedures and Tables for Inspection by Attributes") for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates for and accommodates fluctuating batch sizes and changing conditions, without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium mill site in Monticello, Utah (a CERCLA-regulated clean-up site). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method at the Monticello project has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site
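
The statistical backbone of an attributes plan such as MIL-STD-105E is the operating-characteristic curve: the probability of releasing a batch as a function of its true nonconforming fraction. A minimal sketch, with an invented plan of n = 13 vehicles and acceptance number c = 0 (the actual plan parameters come from the standard's tables):

```python
from scipy import stats

def prob_accept(n, c, p):
    """Operating characteristic of an attributes sampling plan:
    probability of accepting a batch when the true nonconforming
    fraction is p, surveying n units and accepting on <= c failures."""
    return stats.binom.cdf(c, n, p)

# Hypothetical plan: survey 13 vehicles per batch, accept if none fail.
for p in (0.01, 0.05, 0.10, 0.20):
    print(f"p={p:.2f}  P(accept)={prob_accept(13, 0, p):.3f}")
```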

  13. Effects of exercise on glycemic control in type 2 diabetes mellitus in Koreans: the fifth Korea National Health and Nutrition Examination Survey (KNHANES V).

    Science.gov (United States)

    Park, Ji-Hye; Lee, Young-Eun

    2015-11-01

    [Purpose] The aim of this study was to investigate the effect of exercise on glycemic control using data from the fifth Korea National Health and Nutrition Examination Survey and to provide appropriate exercise guidelines for patients with type 2 diabetes mellitus in Korea. [Subjects and Methods] We selected 1,328 patients from the fifth Korea National Health and Nutrition Examination Survey database who had type 2 diabetes and ranged in age from 30 to 90 years. Statistical analyses included χ² tests, multiple linear regression, and logistic regression. [Results] Factors found to be significantly related to glycemic control included income level, physical activity based on the intensity of aerobic exercise, use of diabetes medicine, presence of hypertension, duration of diabetes, and waist circumference. In addition, engaging in combined low- and moderate-intensity aerobic exercise, when adjusted for resistance exercise, was found to lower the risk of glycemic control failure. [Conclusion] Patients with type 2 diabetes mellitus in Korea should engage in combined low- and moderate-intensity aerobic exercise, such as walking for 30 minutes or more five times a week. Physical activity is likely to improve glycemic control and thus prevent the acute and chronic complications of diabetes mellitus.
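
As a hedged illustration of the kind of logistic model reported above (glycemic-control failure regressed on exercise intensity and covariates), here is a statsmodels sketch on synthetic data. All variable names, coefficients, and the data itself are invented stand-ins for the KNHANES variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for KNHANES-style variables (all names illustrative).
df = pd.DataFrame({
    "aerobic": rng.choice(["none", "low_moderate"], size=n),
    "waist": rng.normal(88.0, 10.0, n),          # waist circumference, cm
    "duration": rng.integers(0, 25, n),          # years since diagnosis
})
lin = (-4.0 + 0.04 * df["waist"] + 0.05 * df["duration"]
       - 0.5 * (df["aerobic"] == "low_moderate"))
df["poor_control"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

# Odds of glycemic-control failure by exercise intensity, adjusted for covariates.
model = smf.logit("poor_control ~ C(aerobic) + waist + duration", data=df).fit(disp=0)
print(np.exp(model.params))   # exponentiated coefficients = odds ratios
```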

  14. Statistical process control applied to the manufacturing of beryllia ceramics

    International Nuclear Information System (INIS)

    Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.

    1991-01-01

    To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical

  15. A survey of cross-infection control procedures: knowledge and attitudes of Turkish dentists

    Directory of Open Access Journals (Sweden)

    Emir Yüzbasioglu

    2009-12-01

    OBJECTIVES: The objective of this study was to investigate the knowledge, attitudes and behavior of Turkish dentists in Samsun City regarding cross-infection control. MATERIAL AND METHODS: A questionnaire was designed to obtain information about the procedures used for the prevention of cross-infection in dental practices and to determine the attitudes and perceptions of the responding dental practitioners towards these procedures. The study population included all dentists in the city of Samsun, Turkey, in April 2005 (n=184). The questionnaire collected data on sociodemographic characteristics, knowledge and practice of infection control procedures, sterilization, wearing of gloves and masks, use of rubber dam, methods of storing instruments, disposal methods for contaminated material, etc. Questionnaire data were entered into a computer and analyzed with SPSS statistical software. RESULTS: Of the 184 dentists to whom the questionnaires were submitted, 135 participated in the study (an overall response rate of 73.36%). As many as 74.10% of the dentists expressed concern about the risk of cross-infection from patients to themselves and their dental assistants. Forty-three percent of the participants were able to define "cross-infection" correctly. The great majority of the respondents (95.60%) stated that all patients have to be considered infectious and that universal precautions must apply to all of them. The overall responses to the questionnaire showed that the dentists had moderate knowledge of infection control procedures. CONCLUSIONS: Improved compliance with recommended infection control procedures is required for all dentists evaluated in the present survey. Continuing education programs and short-term courses about cross-infection and infection control procedures would help improve the knowledge of dentists.

  16. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Science.gov (United States)

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  17. Effectiveness of malaria control interventions in Madagascar: a nationwide case-control survey.

    Science.gov (United States)

    Kesteman, Thomas; Randrianarivelojosia, Milijaona; Raharimanga, Vaomalala; Randrianasolo, Laurence; Piola, Patrice; Rogier, Christophe

    2016-02-11

    Madagascar, like other malaria-endemic countries, depends mainly on international funding for the implementation of malaria control interventions (MCI). As these funds no longer increase, policy makers need to know whether these MCI actually provide the expected protection. This study aimed at measuring the effectiveness of the MCI deployed in all transmission patterns of Madagascar in 2012-2013 against the occurrence of clinical malaria cases. From September 2012 to August 2013, patients consulting for non-complicated malaria in 31 sentinel health centres (SHC) were asked to answer a short questionnaire about long-lasting insecticidal net (LLIN) use, indoor residual spraying (IRS) in the household, and intermittent preventive treatment of pregnant women (IPTp) intake. Controls were healthy individuals of all ages sampled from a concurrent cross-sectional survey conducted in areas surrounding the SHC. Cases and controls were retained in the database if they were residents of the same communes. The association between Plasmodium infection and exposure to MCI was calculated by multivariate multilevel models, and the protective effectiveness (PE) of an intervention was defined as 1 minus the odds ratio of this association. Data on 841 cases (out of 6760 cases observed in SHC) and 8284 controls were collected. The regular use of LLIN provided a significant 51% PE (95% CI [16-71]) in multivariate analysis, except in one transmission pattern, where the PE was -11% (95% CI [-251 to 65]) in univariate analysis. The PE of IRS was 51% (95% CI [31-65]), and the PE of exposure to both regular LLIN use and IRS was 72% (95% CI [28-89]) in multivariate analyses. Vector control interventions avoided over 100,000 clinical cases of malaria yearly in Madagascar. The maternal PE of IPTp was 73%. In Madagascar, LLIN and IRS had good PE against clinical malaria. These results may apply to other countries with similar transmission profiles, but such case-control surveys could be
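
The study's effectiveness measure, PE = 1 - OR, can be computed directly from a 2x2 case-control table; the sketch below adds a 95% CI via Woolf's log-odds-ratio method. The counts are illustrative only and do not reproduce the study's multilevel, covariate-adjusted estimates.

```python
import math

def protective_effectiveness(exp_cases, unexp_cases, exp_controls, unexp_controls):
    """PE = 1 - OR from a 2x2 case-control table, with a 95% CI
    computed via Woolf's log-odds-ratio method."""
    or_ = (exp_cases * unexp_controls) / (unexp_cases * exp_controls)
    se = math.sqrt(1 / exp_cases + 1 / unexp_cases
                   + 1 / exp_controls + 1 / unexp_controls)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return 1 - or_, 1 - hi, 1 - lo    # PE and its 95% CI

# Illustrative counts (not the study's data): regular LLIN use among cases/controls.
pe, lo, hi = protective_effectiveness(200, 641, 3500, 4784)
print(f"PE = {pe:.0%} (95% CI {lo:.0%} to {hi:.0%})")
```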

  18. Bootstrap-based confidence estimation in PCA and multivariate statistical process control

    DEFF Research Database (Denmark)

    Babamoradi, Hamid

    Traditional/asymptotic confidence estimation has limited applicability, since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based.... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus... be used to detect outliers in the data, since the outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in case of the Q-statistic...
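
One simple variant of the bootstrap approach discussed in the thesis: resample the in-control data, refit the PCA model, recompute the Q-statistic (squared prediction error), and take a percentile-based control limit. This is a bare-bones sketch on synthetic data, not the thesis's full algorithm, and averaging the per-resample quantiles is just one of several defensible choices.

```python
import numpy as np

def q_statistics(X, n_components=2):
    """Q (squared prediction error) of each sample under a PCA model."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    P = vt[:n_components].T                       # retained loadings
    E = Xc - Xc @ P @ P.T                         # residual part
    return np.sum(E ** 2, axis=1)

def bootstrap_q_limit(X, alpha=0.05, n_boot=1000, seed=0):
    """Percentile-bootstrap control limit for the Q-statistic."""
    rng = np.random.default_rng(seed)
    n = len(X)
    limits = [
        np.quantile(q_statistics(X[rng.integers(0, n, n)]), 1 - alpha)
        for _ in range(n_boot)
    ]
    return np.mean(limits)

X = np.random.default_rng(3).normal(size=(100, 6))
print(f"bootstrap Q limit (95%): {bootstrap_q_limit(X):.2f}")
```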

  19. 78 FR 78415 - Submission for Review: Customer Service Surveys, OMB Control No. 3206-0236

    Science.gov (United States)

    2013-12-26

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Customer Service Surveys, OMB Control No... opportunity to comment on the information collection request (ICR) 3206-0236, Customer Service Surveys. As... workforce. Customer service surveys are valuable tools to gather information from our customers so we can...

  20. Statistical process control support during Defense Waste Processing Facility chemical runs

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water-soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and flyash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous or PHA) is blended with the insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository

  1. Descriptive and inferential statistical methods used in burns research.

    Science.gov (United States)

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case-control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals…

  2. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Villani, N.; Noel, A. [Laboratoire de recherche en radiophysique, CRAN UMR 7039, Nancy universite-CNRS, 54 - Vandoeuvre-les-Nancy (France); Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A. [Departement de radiophysique, centre Alexis-Vautrin, 54 - Vandoeuvre-les-Nancy (France); Francois, P. [Institut Curie, 75 - Paris (France)

    2010-06-15

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to improved security in intensity-modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results; it was therefore necessary to bring portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, borrowed from industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation were established and showed drifts both slow and weak and strong and fast, as well as an introduced special cause (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point by the E.P.I.D. and by the ionisation chamber was evaluated at more than 97 %, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to…

  3. Mathematical Anxiety among Business Statistics Students.

    Science.gov (United States)

    High, Robert V.

    A survey instrument was developed to identify sources of mathematics anxiety among undergraduate business students in a statistics class. A number of statistics classes were selected at two colleges in Long Island, New York. A final sample of n=102 respondents indicated that there was a relationship between the mathematics grade in prior…

  4. Exploring the use of statistical process control methods to assess course changes

    Science.gov (United States)

    Vollstedt, Ann-Marie

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g. inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, in this study, quality was defined as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshmen classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, it was felt that individual course grades needed adjustment to accommodate for the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control…
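
Chauvenet's criterion, the outlier screen named above, is compact in code. A sketch on hypothetical quiz scores (not the dissertation's data):

```python
import numpy as np
from scipy import stats

def chauvenet_keep(x):
    """Chauvenet's criterion: drop a point when the expected number of
    equally extreme points, N * P(|Z| >= |z|), falls below 0.5."""
    x = np.asarray(x, dtype=float)
    z = np.abs(x - x.mean()) / x.std(ddof=1)
    expected = len(x) * 2.0 * stats.norm.sf(z)   # two-sided tail probability
    return expected >= 0.5                       # True = keep the point

scores = np.array([72, 75, 78, 74, 76, 73, 31, 77])   # hypothetical quiz scores
print(scores[chauvenet_keep(scores)])                 # the 31 is rejected
```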

  5. Statistical theory of signal detection

    CERN Document Server

    Helstrom, Carl Wilhelm; Costrell, L; Kandiah, K

    1968-01-01

    Statistical Theory of Signal Detection, Second Edition provides an elementary introduction to the theory of statistical testing of hypotheses that is related to the detection of signals in radar and communications technology. This book presents a comprehensive survey of digital communication systems. Organized into 11 chapters, this edition begins with an overview of the theory of signal detection and the typical detection problem. This text then examines the goals of the detection system, which are defined through an analogy with the testing of statistical hypotheses. Other chapters consider

  6. Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control

    DEFF Research Database (Denmark)

    Vanhatalo, Erik; Kulahci, Murat

    2015-01-01

    A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled…

  7. The current state of external quality control surveys in the German Federal Republic in the field of peptide hormone radioimmunoassays

    International Nuclear Information System (INIS)

    Marschner, I.; Scriba, P.C.; Wood, W.G.; Breuer, H.; Jungbluth, D.; Roehle, G.

    1977-01-01

    Two types of quality control surveys (QCS) are performed in the Federal Republic of Germany in the field of hormone assays: 1) The distribution of two lyophilized sera at regular intervals, in which the participants have to determine 7 or 8 different hormones. Because of the lack of reference methods for peptide hormones, the statistical evaluation of the results indicates only whether the results of the participants are 'correct' or contain systematic or nonsystematic errors with respect to the findings of the other participants. 2) The distribution of approximately 20 deep-frozen sera (including a standard curve in hormone-free serum) in which the participant has to assay a single hormone. These 20-sera QCS are performed only at long intervals for a given hormone. The statistical analysis of the radioactive count rates of the QCS sera and those of the participants' standard curves allows, together with a methodological inquiry form, detection of probable causes of deviating results. (orig.) [de

  8. Statistical Literacy: Data Tell a Story

    Science.gov (United States)

    Sole, Marla A.

    2016-01-01

    Every day, students collect, organize, and analyze data to make decisions. In this data-driven world, people need to assess how much trust they can place in summary statistics. The results of every survey and the safety of every drug that undergoes a clinical trial depend on the correct application of appropriate statistics. Recognizing the…

  9. Statistical process control for alpha spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.
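
The individuals (X) chart mentioned above is the simplest of these SPC tools. A minimal sketch on made-up detector background rates; the moving-range estimate of sigma is the textbook choice, not necessarily the authors':

```python
import numpy as np

def individuals_chart(x):
    """Shewhart individuals (X) chart: centre line at the mean, 3-sigma limits
    from the average moving range (sigma_hat = MRbar / d2, d2 = 1.128 for n=2)."""
    x = np.asarray(x, dtype=float)
    sigma = np.mean(np.abs(np.diff(x))) / 1.128
    centre = x.mean()
    return centre - 3 * sigma, centre, centre + 3 * sigma

# Made-up daily background count rates (counts/s) for a detector
bg = np.array([0.12, 0.11, 0.13, 0.12, 0.14, 0.12, 0.13, 0.24, 0.12, 0.11])
lcl, cl, ucl = individuals_chart(bg)
out = [(i, v) for i, v in enumerate(bg) if not lcl <= v <= ucl]
print(f"CL={cl:.3f}, limits=({lcl:.3f}, {ucl:.3f}), out of control: {out}")
```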

  10. Statistical process control for alpha spectroscopy

    International Nuclear Information System (INIS)

    Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.

    1995-01-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs

  11. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    Science.gov (United States)

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
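
For the SPC side of this comparison, a minimal u-chart sketch for rate-type surveillance data; the counts and central-line days are invented, and this is a generic chart rather than the authors' implementation:

```python
import numpy as np

# Hypothetical monthly CLABSI counts and central-line days (made-up numbers)
infections = np.array([3, 2, 4, 1, 2, 9, 3, 2])
line_days = np.array([800, 760, 820, 790, 810, 805, 795, 780])

u_bar = infections.sum() / line_days.sum()      # pooled in-control rate
ucl = u_bar + 3 * np.sqrt(u_bar / line_days)    # 3-sigma limit, varies with exposure
rates = infections / line_days

for month, (rate, limit) in enumerate(zip(rates, ucl), start=1):
    flag = "  <-- signal" if rate > limit else ""
    print(f"month {month}: {1000 * rate:.2f} per 1000 line-days"
          f" (UCL {1000 * limit:.2f}){flag}")
```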

  12. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should…

  13. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the…
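
Of the three chart types named in this record, the EWMA chart is the one aimed at the slow drifts described. A generic sketch on simulated dose deviations; the smoothing constant lambda = 0.2 and width L = 3 are conventional defaults, not the paper's settings:

```python
import numpy as np

def ewma_signals(x, lam=0.2, L=3.0, phase1=20):
    """EWMA chart: z_i = lam*x_i + (1-lam)*z_{i-1}; signal when |z_i - mu0|
    exceeds L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)**(2*i))) (time-varying limits)."""
    x = np.asarray(x, dtype=float)
    mu0, sigma = x[:phase1].mean(), x[:phase1].std(ddof=1)  # phase-I estimates
    z, out = mu0, []
    for i, xi in enumerate(x, start=1):
        z = lam * xi + (1 - lam) * z
        half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mu0) > half:
            out.append(i)
    return out

rng = np.random.default_rng(7)
# 20 in-control dose deviations (%), then a slow upward drift of 0.1 % per control
dev = np.concatenate([rng.normal(0, 1, 20),
                      rng.normal(0, 1, 30) + 0.1 * np.arange(1, 31)])
sig = ewma_signals(dev)
print("first EWMA signal at control number:", sig[0] if sig else "none")
```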

  14. A survey on control schemes for distributed solar collector fields. Part I: Modeling and basic control approaches

    Energy Technology Data Exchange (ETDEWEB)

    Camacho, E.F.; Rubio, F.R. [Universidad de Sevilla, Escuela Superior de Ingenieros, Departamento de Ingenieria de Sistemas y Automatica, Camino de Los Descubrimientos s/n, E-41092, Sevilla (Spain); Berenguel, M. [Universidad de Almeria, Departamento de Lenguajes y Computacion, Area de Ingenieria de Sistemas y Automatica, Carretera Sacramento s/n, E-04120 La Canada, Almeria (Spain); Valenzuela, L. [Plataforma Solar de Almeria - CIEMAT, Carretera Senes s/n, P.O. Box 22, E-04200 Tabernas, Almeria (Spain)

    2007-10-15

    This article presents a survey of the different automatic control techniques that have been applied to control the outlet temperature of solar plants with distributed collectors during the last 25 years. Different aspects of the control problem involved in this kind of plants are treated, from modeling and simulation approaches to the different basic control schemes developed and successfully applied in real solar plants. A classification of the modeling and control approaches is used to explain the main features of each strategy. (author)

  15. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Science.gov (United States)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
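
Rule-based pattern interpretation, the task such an expert system automates, can be illustrated with two classic Western Electric rules; a hand-rolled sketch on made-up data, not the AISC prototype:

```python
import numpy as np

def western_electric(x, mu, sigma):
    """Two classic Western Electric signals: rule 1, a point beyond 3 sigma;
    rule 4, eight successive points on the same side of the centre line."""
    z = (np.asarray(x, dtype=float) - mu) / sigma
    rule1 = [i for i in range(len(z)) if abs(z[i]) > 3]
    rule4 = [i for i in range(7, len(z))
             if np.all(z[i - 7:i + 1] > 0) or np.all(z[i - 7:i + 1] < 0)]
    return rule1, rule4

# Made-up measurements around a known centre line mu=10, sigma=1
x = [10.2, 9.7, 10.1, 9.9, 10.0, 10.4, 9.8, 10.1, 13.6,
     10.3, 10.5, 10.2, 10.7, 10.4, 10.6, 10.3, 10.8]
r1, r4 = western_electric(x, mu=10.0, sigma=1.0)
print("rule 1 (beyond 3-sigma) at:", r1)   # the 13.6 spike
print("rule 4 (8 on one side) ending at:", r4)  # the run above the centre line
```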

  16. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p < …); the prevalence never exceeded the control limits, that is, was never out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Effect of moulding sand on statistically controlled hybrid rapid casting solution for zinc alloys

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Rupinder [Guru Nanak Dev Engineering College, Ludhiana (India)

    2010-08-15

    The purpose of the present investigations is to study the effect of moulding sand on decreasing the shell wall thickness of mould cavities for economical and statistically controlled hybrid rapid casting solutions (a combination of three-dimensional printing and conventional sand casting) for zinc alloys. Starting from the identification of a component/benchmark, technological prototypes were produced at different shell wall thicknesses supported by three different types of sand (namely: dry, green and molasses). Prototypes prepared by the proposed process are for assembly-check purposes and not for functional validation of the parts. The study suggested that a shell wall with a less-than-recommended thickness (12 mm) is more suitable for dimensional accuracy. The best dimensional accuracy was obtained at a 3 mm shell wall thickness with green sand. The process was found to be under statistical control.

  18. A Survey of Open-Source UAV Flight Controllers and Flight Simulators

    DEFF Research Database (Denmark)

    Ebeid, Emad Samuel Malki; Skriver, Martin; Terkildsen, Kristian Husum

    2018-01-01

    … which are all tightly linked to the UAV flight controller hardware and software. The lack of standardization of flight controller architectures and the use of proprietary closed-source flight controllers on many UAV platforms, however, complicates this work: solutions developed for one flight controller may be difficult to port to another without substantial extra development and testing. Using open-source flight controllers mitigates some of these challenges and enables other researchers to validate and build upon existing research. This paper presents a survey of the publicly available open…

  19. Statistically Controlling for Confounding Constructs Is Harder than You Think.

    Directory of Open Access Journals (Sweden)

    Jacob Westfall

    Full Text Available Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest--in some cases approaching 100%--when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity.
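
The mechanism behind these spurious findings is reproducible in a few lines of simulation. A sketch of the simplest case, two equally noisy measures of one latent construct, loosely following the paper's argument rather than its exact Monte Carlo design:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sim_once(n=500, reliability=0.7):
    """y depends only on latent T; x1 and x2 are equally noisy measures of T.
    x2 has no incremental validity over a perfectly measured T, yet testing
    its coefficient in y ~ x1 + x2 rejects far too often, because measurement
    error in x1 leaves T-variance for x2 to soak up."""
    T = rng.normal(size=n)
    err_sd = np.sqrt(1.0 / reliability - 1.0)
    x1 = T + rng.normal(scale=err_sd, size=n)
    x2 = T + rng.normal(scale=err_sd, size=n)
    y = T + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    s2 = resid @ resid / (n - 3)
    t = beta[2] / np.sqrt(s2 * np.linalg.inv(X.T @ X)[2, 2])
    return 2 * stats.t.sf(abs(t), df=n - 3) < 0.05

rate = np.mean([sim_once() for _ in range(500)])
print(f"rejection rate for the 'incremental' coefficient: {rate:.2f}")  # near 1, not 0.05
```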

  20. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    Science.gov (United States)

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  1. A rank-based algorithm of differential expression analysis for small cell line data with statistical control.

    Science.gov (United States)

    Li, Xiangyu; Cai, Hao; Wang, Xianlong; Ao, Lu; Guo, You; He, Jun; Gu, Yunyan; Qi, Lishuang; Guan, Qingzhou; Lin, Xu; Guo, Zheng

    2017-10-13

    To detect differentially expressed genes (DEGs) in small-scale cell line experiments, usually with only two or three technical replicates for each state, the commonly used statistical methods such as significance analysis of microarrays (SAM), limma and RankProd (RP) lack statistical power, while the fold-change method lacks any statistical control. In this study, we demonstrated that the within-sample relative expression orderings (REOs) of gene pairs were highly stable among technical replicates of a cell line but often widely disrupted after certain treatments such as gene knockdown, gene transfection and drug treatment. Based on this finding, we customized the RankComp algorithm, previously designed for individualized differential expression analysis through REO comparison, to identify DEGs with certain statistical control for small-scale cell line data. In both simulated and real data, the new algorithm, named CellComp, exhibited high precision with much higher sensitivity than the original RankComp, SAM, limma and RP methods. Therefore, CellComp provides an efficient tool for analyzing small-scale cell line data. © The Author 2017. Published by Oxford University Press.
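
The within-sample REO idea is easy to illustrate. A toy sketch on synthetic expression values (not the authors' CellComp code), checking which gene-pair orderings are stable across replicates and which flip after a simulated knockdown:

```python
import numpy as np
from itertools import combinations

def reo_signs(expr):
    """Within-sample relative expression orderings: for every gene pair (a, b),
    the sign of expr[a] - expr[b] in each sample (column)."""
    pairs = list(combinations(range(expr.shape[0]), 2))
    return np.array([np.sign(expr[a] - expr[b]) for a, b in pairs])

rng = np.random.default_rng(5)
base = rng.normal(8.0, 2.0, size=10)                   # 10 genes, log2 scale
control = base[:, None] + rng.normal(0, 0.1, (10, 3))  # 3 technical replicates
treated = base[:, None] + rng.normal(0, 0.1, (10, 3))
treated[0] -= 4.0                                      # simulated knockdown of gene 0

c, t = reo_signs(control), reo_signs(treated)
stable = np.all(c == c[:, [0]], axis=1)                # same order in all control reps
flipped = stable & np.all(t == -c[:, [0]], axis=1)     # consistently reversed
print(f"{stable.sum()} of {len(c)} pairs stable; {flipped.sum()} reversed after knockdown")
```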

  2. The exclusion from welfare benefits: Resentment and survey attrition in a randomized controlled trial in Mexico.

    Science.gov (United States)

    Stecklov, Guy; Weinreb, Alexander; Winters, Paul

    2016-11-01

    Public policy programs must often impose limits on who may be eligible for benefits. Despite research on the impact of exclusion in developed countries, there is little evidence on how people react to being excluded from benefits in developing societies. Utilizing repeated waves of data from an experimental evaluation of Mexico's foundational PROGRESA antipoverty program, we examine the impact of exclusion and distinguish two separate forms. "Statistical exclusion" occurs where determination of benefits is based on randomized assignment to a treatment and control group. "Needs-based exclusion" occurs when benefits programs are designed to be selective rather than universal, basing eligibility on characteristics, like relative poverty, that are difficult to measure simply and accurately. Focusing on temporal variation in survey non-response as our behavioral outcome, we show that needs-based exclusion has much greater negative effects on continued participation than statistical exclusion. We also show that these effects are concentrated among the wealthy, that is, those furthest from the eligibility cut-off line. These findings reinforce general concerns about the validity of evaluation studies when incentives are at work. We discuss both the behavioral explanations that might underlie these findings as well as some potential approaches to reduce threats to evaluation validity. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Mainstreaming Remedial Mathematics Students in Introductory Statistics: Results Using a Randomized Controlled Trial

    Science.gov (United States)

    Logue, Alexandra W.; Watanabe-Rose, Mari

    2014-01-01

    This study used a randomized controlled trial to determine whether students, assessed by their community colleges as needing an elementary algebra (remedial) mathematics course, could instead succeed at least as well in a college-level, credit-bearing introductory statistics course with extra support (a weekly workshop). Researchers randomly…

  4. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  5. Results of quality control surveys of radioimmunological determinations of thyrotropin in newborns

    International Nuclear Information System (INIS)

    Roehle, G.; Kruse, R.; Voigt, U.; Torresani, T.

    1983-01-01

    Within the quality control scheme of the Deutsche Gesellschaft fuer Klinische Chemie, seven quality control surveys of thyrotropin (TSH) determinations in blood dried on filter paper have been carried out since 1980. Ninety-six screening laboratories from 12 European countries took part in these surveys. In a single survey each participant usually analysed four different samples; each of these consisted of three spots of dried blood spiked with defined amounts of thyrotropin. For the evaluations of the surveys the participants were asked to give information about their analytical results, and from these, their diagnostic classifications. The medians of the analytical results correlated well with the given thyrotropin concentrations, but the individual estimations from different laboratories varied greatly. Major discrepancies of classification were also apparent, both in the given thyrotropin concentrations and in the individual estimations. Two special collaborative studies with nine selected laboratories showed on the one hand that analysis of the largest possible part of the dried blood sample can help to optimize the precision of the results; on the other hand, considerable drawbacks related to the reagents and the methods were sometimes observed. (orig.) [de

  6. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Science.gov (United States)

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  7. MAX and Survey Linkages

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS is interested in linking MAX files with survey data, including four surveys conducted by the National Center for Health Statistics (NCHS) - the National Health...

  8. A national survey of radiodiagnostic services in Ecuador

    International Nuclear Information System (INIS)

    Penaherrera S, P.; Echeverria T, F.; Buitron S, S.; Yela de Chacon, L.

    1979-11-01

    The Ecuadorian Atomic Energy Commission elaborated a Radiation Protection Regulation for Ecuador. In order to implement it, a national survey of radiodiagnostic services was carried out with the following objectives: a) statistics of radiodiagnostic services in relation to geography and population density; b) to establish general patterns for X-ray control and calibration; c) evaluation of the professional and technical work in this field. (Author)

  9. Pengendalian Kualitas Produk Di Industri Garment Dengan Menggunakan Statistical Procces Control (SPC

    Directory of Open Access Journals (Sweden)

    Rizal Rachman

    2017-09-01

    Full Text Available The company views quality as a key factor in its success and in meeting the quality standards set by the buyer. The purpose of this study was to determine the level of product defects within quality control limits in the garment production process at PT. Asia Penta Garment. The study uses the statistical process control method, drawing on secondary data in the form of reports of production volumes and garment defects in the finishing section in January 2017. The results show defects beyond the control limits (out of control) with respect to the upper control limit (UCL) and the lower control limit (LCL), and an average defect rate out of control. To improve product quality, in particular of the garments the company produces, the established quality policy must be implemented properly: negotiating raw materials with the buyer according to standards, recruiting experienced workers, maintaining high work discipline, coaching employees, paying bonuses to employees who meet targets and show high discipline, repairing machines continuously, and keeping the work environment clean, comfortable and safe. Keywords: quality control, product quality, SPC.

  10. Pengendalian Kualitas Produk Di Industri Garment Dengan Menggunakan Statistical Procces Control (SPC)

    OpenAIRE

    Rizal Rachman

    2017-01-01

    The company views quality as a key factor in its success and in meeting the quality standards set by the buyer. The purpose of this study was to determine the level of product defects within quality control limits in the garment production process at PT. Asia Penta Garment. The study uses the statistical process control method. The data used in this study are secondary data in the form of reports of production volumes and defects in finished garments…

  11. LHC Survey Laser Tracker Controls Renovation

    CERN Document Server

    Charrondière, C

    2011-01-01

    The LHC survey laser tracker control system is based on an industrial software package (Axyz) from Leica Geosystems™ that has an interface to Visual Basic™, which we used to automate the geometric measurements for the LHC magnets. With the new version of the Leica software, this Visual Basic™ interface is no longer available and we had to redesign the interface software to adapt to a PC-DMIS server that replaced the Axyz software. As this package is no longer supported, we have taken the decision to recode the automation application in LabVIEW. This presentation describes the existing equipment, interface and application, showing the reasons for our decisions to move to PC-DMIS and LabVIEW. A comparison between the new and legacy systems is made.

  12. Doctorate Education in Canada: Findings from the Survey of Earned Doctorates, 2005/2006. Culture, Tourism and the Centre for Education Statistics. Research Paper. Catalogue no. 81-595-M No. 069

    Science.gov (United States)

    King, Darren; Eisl-Culkin, Judy; Desjardins, Louise

    2008-01-01

    "Doctorate Education in Canada: Findings from the Survey of Earned Doctorates, 2005/2006" is the third paper in a series of reports written by the Learning Policy Directorate of Human Resources and Social Development Canada (HRSDC) and the Centre for Education Statistics of Statistics Canada. Each report presents an overview of doctoral…

  13. STATISTIC MODEL OF DYNAMIC DELAY AND DROPOUT ON CELLULAR DATA NETWORKED CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    MUHAMMAD A. MURTI

    2017-07-01

    Full Text Available Delay and dropout are important parameters that influence overall control performance in a Networked Control System (NCS). The goal of this research is to find a model of the delay and dropout of the data communication link in the NCS. Experiments were conducted on the water level control of a boiler tank, as part of an NCS based on an internet communication network using High Speed Packet Access (HSPA) cellular technology. From these experiments, the closed-loop system response as well as the delay and dropout of data packets were obtained. This research contributes a model of the NCS as a combination of the controlled plant and the data communication link, together with a statistical model of delay and dropout on the NCS.
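
What a statistical model of such a link might look like can be sketched with simulated data; the log-normal delay shape, 3 % dropout rate and 250 ms deadline below are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated round-trip delays (ms) on an HSPA-like link; NaN marks a dropped packet
delays = rng.lognormal(mean=np.log(120.0), sigma=0.35, size=1000)
delays[rng.random(1000) < 0.03] = np.nan   # 3 % dropout, assumed

received = delays[~np.isnan(delays)]
print(f"dropout rate: {np.isnan(delays).mean():.3f}")
print(f"delay: mean {received.mean():.0f} ms, "
      f"95th percentile {np.percentile(received, 95):.0f} ms")

# For the control loop, a packet arriving after the sampling deadline is as bad as lost
deadline_ms = 250.0
late_or_lost = np.nan_to_num(delays, nan=np.inf) > deadline_ms
print(f"effective loss (dropped or late): {late_or_lost.mean():.3f}")
```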

  14. Regional quality control survey of blood-gas analysis.

    Science.gov (United States)

    Minty, B D; Nunn, J F

    1977-09-01

    We undertook an external quality control survey of blood-gas analysis in 16 laboratories at 13 hospitals. All samples were prepared in the laboratories under investigation by equilibration of blood or serum with gas mixtures of known composition. pH of serum was measured with no significant bias but with an SD of random error of 0.026 pH units, which was almost twice the SD of the reference range (0.015). An acceptable random error (half the SD of the reference range) was not obtained in a longitudinal internal quality control survey, although there were acceptable results for buffer pH in both field and internal surveys. Blood PO2 was measured with no significant bias but with an SD of random error of 1.38 kPa, which reduced to 0.72 kPa by excluding one egregious result. The latter value was just over half of the SD of the reference range (1.2 kPa). PCO2 of blood was also measured without significant bias but with a much smaller SD of random error of 0.28 kPa (by excluding one egregious result), which was again just over half the SD of the reference range (0.51 kPa). Measurements of blood PO2 and PCO2 seem generally acceptable in relation to their respective reference ranges, but measurements of pH were unsatisfactory in both internal and external trials.

  15. INNOVATIVE APPROACH TO EDUCATION AND TEACHING OF STATISTICS

    Directory of Open Access Journals (Sweden)

    Andrea Jindrová

    2010-06-01

    Full Text Available Educational and tutorial programs develop together with the changing world of information technology, and it is necessary to adapt to and accept new possibilities and needs. The use of online learning tools can amplify our teaching resources and create new types of learning opportunities that did not exist in the pre-Internet age. The world is full of information, which needs to be constantly updated; virtualisation of study materials enables us to update and manage them quickly and easily. An advantage is the asynchronous approach to learning materials, which can be tailored to students' needs and adjusted according to their time and availability. A specific feature of learning statistics is the variety of statistical software programs. The high technical demands of these programs call for tutorials (instructional presentations), which can help students learn how to use them efficiently. An instructional presentation may be understood as a demonstration of how a statistical software program works; it is one of the options students may use to simplify control of, and navigation through, the statistical system. Thanks to instructional presentations, students are able to transfer their theoretical statistical knowledge to practical situations and real life and, therefore, improve their personal development. The goal of this tutorial is to show an innovative approach to the teaching of statistics at the Czech University of Life Sciences. The use of presentations and their benefits for students was evaluated according to results obtained from a questionnaire survey completed by students of the 4th grade of the Faculty of Economics and Management. The aim of this pilot survey was to evaluate the benefits of these instructional presentations and the students' interest in using them. The information obtained was used as essential data for the evaluation of the efficiency of this new approach. Firstly…

  16. Whither Statistics Education Research?

    Science.gov (United States)

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  17. Survey of Business Owners and Self-Employed Persons (SBO)

    Science.gov (United States)

    Our statistics highlight trends in household statistics from multiple surveys.

  18. A survey of approaches combining safety and security for industrial control systems

    International Nuclear Information System (INIS)

    Kriaa, Siwar; Pietre-Cambacedes, Ludovic; Bouissou, Marc; Halgand, Yoran

    2015-01-01

    The migration towards digital control systems creates new security threats that can endanger the safety of industrial infrastructures. Addressing the convergence of safety and security concerns in this context, we provide a comprehensive survey of existing approaches to industrial facility design and risk assessment that consider both safety and security. We also provide a comparative analysis of the different approaches identified in the literature. - Highlights: • We raise awareness of safety and security convergence in numerical control systems. • We highlight safety and security interdependencies for modern industrial systems. • We give a survey of approaches combining safety and security engineering. • We discuss the potential of the approaches to model safety and security interactions

  19. Human factors survey of advanced instrumentation and controls technologies in nuclear plants

    International Nuclear Information System (INIS)

    Carter, R.J.

    1992-01-01

    A survey of advanced instrumentation and controls (I&C) technologies and associated human factors issues in the US and Canadian nuclear industries was carried out by a team from Oak Ridge National Laboratory to provide background for the development of regulatory policy, criteria, and guides for review of advanced I&C systems as well as human engineering guidelines for evaluating these systems. The survey found those components of the US nuclear industry surveyed to be quite interested in advanced I&C, but very cautious in implementing such systems in nuclear facilities and power plants. The trend in the facilities surveyed is to experiment cautiously when there is an intuitive advantage or short-term payoff. In the control room, the usual practice is direct substitution of digital and microprocessor-based instruments or systems that are functionally identical to the analog instruments or systems being replaced. The most advanced I&C systems were found in the Canadian CANDU plants, where the newest plant has digital systems in almost 100% of its control systems and in over 70% of its plant protection system. The hypothesis that properly introducing digital systems increases safety is supported by the Canadian experience. The performance of these digital systems was achieved using an appropriate quality assurance program for the software development. The ability of digital systems to detect impending failures and initiate a fail-safe action is a significant safety issue that should be of special interest to every US utility as well as to the US Nuclear Regulatory Commission. (orig.)

  20. Prepaid monetary incentives-Predictors of taking the money and completing the survey: Results from the International Tobacco Control (ITC) Four Country Survey.

    Science.gov (United States)

    Mutti, Seema; Kennedy, Ryan David; Thompson, Mary E; Fong, Geoffrey T

    2014-05-01

    Prepaid monetary incentives are used to address declining response rates in random-digit-dial surveys. There is concern among researchers that some respondents will accept the prepayment but not complete the survey, yet there is little research on check-cashing and survey-completion behaviors among respondents who receive prepayment. Data from the International Tobacco Control Four Country Study, a longitudinal survey of smokers in Canada, the US, the UK, and Australia, were used to examine the impact of prepayment (in the form of checks of approximately US$10) on the sample profile. Approximately 14% of respondents cashed their check but did not complete the survey, while about 14% did not cash their check but completed the survey. Younger adults (Canada, US), those of minority status (US), and those who had been in the survey for only two waves or less (Canada, US) were more likely to cash their checks and not complete the survey.

  1. Fiducial registration error as a statistical process control metric in image-guided radiotherapy with prostatic markers

    International Nuclear Information System (INIS)

    Ung, M.N.; Wee, Leonard

    2010-01-01

    Full text: Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body (PBRB) image registration during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). Statistical process control charts for individual patients can be designed to identify potentially significant deviation of FRE from expected behaviour. In this study, the aim was to retrospectively apply statistical process control methods to FREs in 34 individuals to identify parameters that may impact on the process stability in image-based localization. A robust procedure for estimating control parameters, control limits and fixed tolerance levels from a small number of initial observations has been proposed and discussed. Four distinct types of qualitative control chart behaviour have been observed. Probable clinical factors leading to IGRT process instability are discussed in light of the control chart behaviour. Control charts have been shown to be a useful decision-making tool for detecting potentially out-of-control processes on an individual basis. They can sensitively identify potential problems that warrant more detailed investigation in the IGRT of prostate cancer.
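
FRE is cheap to compute once the rigid-body fit is done. A self-contained sketch (a Kabsch/Procrustes fit on three hypothetical marker positions; the 0.5 mm localisation noise is an assumption, not the study's value):

```python
import numpy as np

def fre_rigid(fixed, moving):
    """Point-based rigid-body registration (Kabsch/Procrustes) and the fiducial
    registration error: RMS distance between corresponding markers after the
    optimal rotation + translation."""
    fc, mc = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - mc).T @ (fixed - fc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = fc - R @ mc
    resid = fixed - (moving @ R.T + t)
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))

# Three hypothetical prostate markers (mm): planning CT vs. portal localization
ct = np.array([[12.0, 40.0, -5.0], [-8.0, 35.0, 3.0], [2.0, 55.0, 10.0]])
rng = np.random.default_rng(4)
portal = ct + rng.normal(0, 0.5, ct.shape)   # localisation noise only
print(f"FRE = {fre_rigid(ct, portal):.2f} mm")
```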

  2. The macroeconomic consequences of controlling greenhouse gases: a survey

    International Nuclear Information System (INIS)

    Boero, Gianna; Clarke, Rosemary; Winters, L.A.

    1991-01-01

    This is the summary of a major report which provides a survey of existing estimates of the macroeconomic consequences of controlling greenhouse gas emissions, particularly carbon dioxide (CO2). There are broadly speaking two main questions. What are the consequences of global warming for economic activity and welfare? What, if any, are the economic consequences of reducing the levels of greenhouse gas (GHG) emissions? This survey covers only those studies which quantify the overall (macroeconomic) costs of abating greenhouse gas emissions. It is not concerned with whether any particular degree of abatement is sufficient to reduce global warming, nor whether it is worth undertaking in the light of its benefits. These are topics for other researchers and other papers. Here we are concerned only to map the relationship between economic welfare and GHG abatement. (author)

  3. How well can online GPS PPP post-processing services be used to establish geodetic survey control networks?

    Science.gov (United States)

    Ebner, R.; Featherstone, W. E.

    2008-09-01

    Establishing geodetic control networks for subsequent surveys can be a costly business, even when using GPS. Multiple stations should be occupied simultaneously and post-processed with scientific software. However, the free availability of online GPS precise point positioning (PPP) post-processing services offers the opportunity to establish a whole geodetic control network with just one dual-frequency receiver and one field crew. To test this idea, we compared coordinates from a moderate-sized (~550 km by ~440 km) geodetic network of 46 points over part of south-western Western Australia, which were processed both with the Bernese v5 scientific software and with the CSRS (Canadian Spatial Reference System) PPP free online service. After rejection of five stations where the antenna type was not recognised by CSRS, the PPP solutions agreed on average with the Bernese solutions to 3.3 mm in east, 4.8 mm in north and 11.8 mm in height. The average standard deviations of the Bernese solutions were 1.0 mm in east, 1.2 mm in north and 6.2 mm in height, whereas for CSRS they were 3.9 mm in east, 1.9 mm in north and 7.8 mm in height, reflecting the inherently lower precision of PPP. However, at the 99% confidence level, only one CSRS solution was statistically different to the Bernese solution in the north component, due to a data interruption at that site. Nevertheless, PPP can still be used to establish geodetic survey control, albeit with a slightly lower quality because of the larger standard deviations. This approach may be of particular benefit in developing countries or remote regions, where geodetic infrastructure is sparse and would not normally be established without this approach.

  4. Fact-finding survey of nosocomial infection control in hospitals in Vietnam and application to training programs.

    Science.gov (United States)

    Ohara, Hiroshi; Hung, Nguyen Viet; Thu, Truong Anh

    2009-12-01

    Nosocomial infection control is crucial for improving the quality of medical care. It is also indispensable for implementing effective control measures for severe acute respiratory syndrome (SARS) and the possible occurrence of a human influenza pandemic. The present authors, in collaboration with Vietnamese hospital staff, performed a fact-finding survey of nosocomial infection control in hospitals in northern Vietnam and compared the results with those of a survey conducted 4 years previously. Remarkable improvement was recognized in this period, although there were considerable differences between the central hospitals in Hanoi and local hospitals. In the local hospitals, basic techniques and the systems for infection control were regarded as insufficient, and it is necessary to improve these techniques and systems under the guidance of hospitals in the central area. Based on the results of the survey, programs were prepared and training courses were organized in local hospitals. Evaluation conducted after the training courses showed a high degree of satisfaction among the trainees. The results of the survey and the training courses conducted during the study period are expected to contribute to the improvement of nosocomial infection control in remote areas of Vietnam.

  5. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision, and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
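
    A minimal sketch of how bias and precision might be estimated from control-standard results; the certified levels and analyst measurements below are invented for illustration, not RAL data.

      import numpy as np

      known = np.array([10.0, 10.0, 10.0, 50.0, 50.0, 50.0])      # certified levels
      measured = np.array([10.4, 10.1, 10.3, 50.9, 51.2, 50.7])   # analyst results

      residuals = measured - known
      bias = residuals.mean()               # estimate of systematic bias
      precision = residuals.std(ddof=1)     # 1-sigma estimate of precision
      print(f"bias = {bias:.3f}, precision (1 s.d.) = {precision:.3f}")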

  6. LHC survey laser tracker controls renovation

    International Nuclear Information System (INIS)

    Charrondiere, C.; Nybo, M.

    2012-01-01

    The LHC survey laser tracker control system is based on an industrial software package (Axyz) from Leica Geosystems (TM) that has an interface to Visual Basic (TM), which we used to automate the geometric measurements for the LHC magnets. With the new version of the Leica software, this Visual Basic (TM) interface is no longer available and we had to redesign the interface software to adapt to a PC-DMIS server that replaced the Axyz software. As this package is no longer supported, we have taken the decision to re-code the automation application in LabVIEW. This presentation describes the existing equipment, interface and application showing the reasons for our decisions to move to PC-DMIS and LabVIEW. A comparison between the new and the existing system is made. (authors)

  7. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Science.gov (United States)

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  8. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    Science.gov (United States)

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
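
    The consistency check can be illustrated with a short sketch: recompute a two-sided p-value from a reported t statistic and degrees of freedom, then flag mismatches, and "gross" mismatches that flip the significance decision. The tolerance and the example numbers are assumptions; the study's actual automated procedure handles rounding of reported values more carefully.

      from scipy import stats

      def check_p(t, df, p_reported, tol=1e-2, alpha=0.05):
          p = 2 * stats.t.sf(abs(t), df)        # two-sided p from t and df
          inconsistent = abs(p - p_reported) > tol
          # gross: the inconsistency flips the significance decision
          gross = inconsistent and ((p < alpha) != (p_reported < alpha))
          return round(p, 4), inconsistent, gross

      print(check_p(t=2.05, df=28, p_reported=0.050))  # recomputed p ~ 0.05: consistent
      print(check_p(t=1.70, df=28, p_reported=0.030))  # recomputed p ~ 0.10: gross error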

  9. OIL-LOSS ANALYSIS OF CRUDE PALM OIL (CPO) USING THE STATISTICAL PROCESS CONTROL METHOD

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2014-06-01

    Full Text Available PKS "XYZ" is a company engaged in palm-oil processing. Its products are Crude Palm Oil (CPO) and Palm Kernel Oil (PKO). The aim of this study is to analyse oil losses and their contributing factors using the Statistical Process Control method. Statistical Process Control is a set of strategies, techniques, and actions taken by an organization to ensure that it produces quality products or provides quality services. The CPO oil-loss samples examined were empty fruit bunches, nuts, fibre, and final sludge. Based on the I-MR control chart, it can be concluded that all four types of CPO oil losses are within the control limits and consistent. The Cpk value of total oil losses, however, lies outside the control limits of the process mean; this means the CPO produced has met customer requirements, with total oil losses below the company's maximum limit of 1.65%.
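
    A minimal sketch of the I-MR limits referred to above, using the standard n = 2 chart constants (d2 = 1.128, D4 = 3.267) and invented daily oil-loss percentages.

      import numpy as np

      x = np.array([1.42, 1.55, 1.48, 1.60, 1.51, 1.45, 1.58, 1.49, 1.53, 1.47])
      mr = np.abs(np.diff(x))                 # moving ranges of consecutive points
      mr_bar = mr.mean()

      ucl_i = x.mean() + 3 * mr_bar / 1.128   # individuals chart limits
      lcl_i = x.mean() - 3 * mr_bar / 1.128
      ucl_mr = 3.267 * mr_bar                 # moving-range chart upper limit

      print(f"I chart:  CL={x.mean():.3f}  LCL={lcl_i:.3f}  UCL={ucl_i:.3f}")
      print(f"MR chart: CL={mr_bar:.3f}  UCL={ucl_mr:.3f}")
      in_control = ((x >= lcl_i) & (x <= ucl_i)).all() and (mr <= ucl_mr).all()
      print("in control:", bool(in_control))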

  10. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    Science.gov (United States)

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
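
    The exponentially weighted moving average chart mentioned above can be sketched as follows; the daily output values, smoothing constant (lambda = 0.2) and limit width (L = 3) are illustrative assumptions, not the authors' settings.

      import numpy as np

      x = np.array([100.2, 99.8, 100.5, 100.1, 99.6, 100.9, 101.4, 101.8, 102.1])
      mu0, sigma, lam, L = 100.0, 0.5, 0.2, 3.0

      z = np.empty_like(x)
      z[0] = lam * x[0] + (1 - lam) * mu0
      for i in range(1, len(x)):
          z[i] = lam * x[i] + (1 - lam) * z[i - 1]

      k = np.arange(1, len(x) + 1)
      # Time-varying EWMA control limits around the target mu0
      half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
      out = (z > mu0 + half_width) | (z < mu0 - half_width)
      print("EWMA:", np.round(z, 3))
      print("out-of-control points:", np.flatnonzero(out))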

  11. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  12. The matchmaking paradox: a statistical explanation

    International Nuclear Information System (INIS)

    Eliazar, Iddo I; Sokolov, Igor M

    2010-01-01

    Medical surveys regarding the number of heterosexual partners per person yield different female and male averages, a result which, from a physical standpoint, is impossible. In this paper we term this puzzle the 'matchmaking paradox', and establish a statistical model explaining it. We consider a bipartite graph with N male and N female nodes (N >> 1), and B bonds connecting them (B >> 1). Each node is assigned a random 'attractiveness level', and the bonds connect to the nodes randomly, with probabilities which are proportionate to the nodes' attractiveness levels. The population's average bonds-per-node B/N is estimated via a sample average calculated from a survey of size n (n >> 1). A comprehensive statistical analysis of this model is carried out, asserting that (i) the sample average well estimates the population average if and only if the attractiveness levels possess a finite mean; (ii) if the attractiveness levels are governed by a 'fat-tailed' probability law then the sample average displays wild fluctuations and strong skew, thus providing a statistical explanation of the matchmaking paradox.
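
    The model invites a short simulation. The sketch below draws attractiveness levels from a Pareto family (alpha = 0.8 is fat-tailed with infinite mean), attaches bonds with probabilities proportional to attractiveness, and shows how survey averages of bonds-per-node fluctuate; the population and survey sizes are arbitrary choices, not the paper's.

      import numpy as np

      rng = np.random.default_rng(1)
      N, B, n = 100_000, 300_000, 1_000

      def sample_average(alpha):
          a = rng.pareto(alpha, N) + 1.0                   # attractiveness levels
          deg = rng.multinomial(B, a / a.sum())            # bonds per node
          survey = rng.choice(deg, size=n, replace=False)  # survey of n respondents
          return survey.mean()

      print("population average:", B / N)
      for alpha in (3.0, 0.8):    # finite-mean versus infinite-mean levels
          estimates = [sample_average(alpha) for _ in range(5)]
          print(f"alpha={alpha}: survey averages", np.round(estimates, 2))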

  13. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as genetics syllabi used by instructors do not help the issue. It was found that the text books, often times, either did not give effective explanations for students, or completely left out certain topics. The omission of certain statistical/mathematical oriented topics was seen to be also true with the genetics syllabi reviewed for this study. Nonetheless

  14. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    Science.gov (United States)

    Gourrion, J.; Szekely, T.

    2016-02-01

    Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit the next version of the Copernicus CMEMS dataset.
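
    A toy version of the validity-interval test: a new value is flagged only if it falls outside the extreme values of the local historical reference, here widened by an ad hoc 10% margin. The series is invented; the operational CORA implementation is considerably more elaborate.

      import numpy as np

      historical = np.array([12.1, 12.8, 13.4, 11.9, 13.1, 12.6])  # local reference (degC)
      margin = 0.1 * (historical.max() - historical.min())

      lo, hi = historical.min() - margin, historical.max() + margin
      new_obs = np.array([12.4, 13.9, 11.2, 12.9])
      flags = (new_obs < lo) | (new_obs > hi)
      print(f"validity interval: [{lo:.2f}, {hi:.2f}]")
      print("flagged observations:", new_obs[flags])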

  15. Helping Raise the Official Statistics Capability of Government Employees

    Directory of Open Access Journals (Sweden)

    Forbes Sharleen

    2016-12-01

    Full Text Available Both the production and the use of official statistics are important in the business of government. In New Zealand, concern persists about many government advisors' low level of statistical capability. One programme designed specifically to enhance capability is New Zealand's National Certificate of Official Statistics, first introduced in 2007 and originally targeted at government policy analysts and advisors. It now includes participants from many agencies, including the National Statistics Office. The competency-based 40-credit certificate comprises four taught units that aim to give students skills in basic official statistics and in critically evaluating statistical, research, policy, or media publications for their quality (of data, survey design, analysis, and conclusions) and appropriateness for some policy issue (e.g., how to reduce problem gambling), together with an 'umbrella' workplace-based statistics project. Case studies are used to embed the statistics learning into the real-world context of these students. Several surveys of students and their managers were undertaken to evaluate the effectiveness of the certificate in terms of enhancing skill levels and meeting organisational needs and also to examine barriers to completion of the certificate. The results were used to both modify the programme and extend its international applicability.

  16. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    Science.gov (United States)

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  17. Progress in XRCS-Survey plant instrumentation and control design for ITER

    International Nuclear Information System (INIS)

    Varshney, Sanjeev; Jha, Shivakant; Simrock, Stefan; Barnsley, Robin; Martin, Vincent; Mishra, Sapna; Patil, Prabhakant; Patel, Shreyas; Kumar, Vinay

    2016-01-01

    Highlights: • An identification of the major process functions has been made for the XRCS-Survey plant I&C system compliant to the Plant Control Design Handbook (PCDH). • I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using Enterprise Architect (EA). • I&C architecture, interface with ITER networks and Plants, and configuration of cubicles are discussed towards nine design review deliverables. - Abstract: A real time, plasma impurity survey system based on X-ray Crystal Spectroscopy (XRCS) has been designed for ITER and will be made available in the set of first plasma diagnostics for measuring impurity ion concentrations and their in-flux. For the purpose of developing a component level design of the XRCS-Survey plant I&C system that is compliant to the rules and guidelines defined in the Plant Control Design Handbook (PCDH), firstly an identification of the major process functions has been made. The preliminary plant I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using a system engineering tool, Enterprise Architect (EA). Conceptual I&C architecture and interfaces with the ITER networks and other Plants have been discussed along with the basic configuration of I&C cubicles, aiming towards nine I&C deliverables for the design review.

  18. Progress in XRCS-Survey plant instrumentation and control design for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Varshney, Sanjeev, E-mail: sanjeev.varshney@iter-india.org [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India); Jha, Shivakant [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India); Simrock, Stefan; Barnsley, Robin; Martin, Vincent [ITER-Organization, Route de Vinon sur Verdon, CS 90 046, 13067 St. Paul-Lez-Durance, Cedex (France); Mishra, Sapna [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India); Patil, Prabhakant [ITER-Organization, Route de Vinon sur Verdon, CS 90 046, 13067 St. Paul-Lez-Durance, Cedex (France); Patel, Shreyas; Kumar, Vinay [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India)

    2016-11-15

    Highlights: • An identification of the major process functions has been made for the XRCS-Survey plant I&C system compliant to the Plant Control Design Handbook (PCDH). • I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using Enterprise Architect (EA). • I&C architecture, interface with ITER networks and Plants, and configuration of cubicles are discussed towards nine design review deliverables. - Abstract: A real time, plasma impurity survey system based on X-ray Crystal Spectroscopy (XRCS) has been designed for ITER and will be made available in the set of first plasma diagnostics for measuring impurity ion concentrations and their in-flux. For the purpose of developing a component level design of the XRCS-Survey plant I&C system that is compliant to the rules and guidelines defined in the Plant Control Design Handbook (PCDH), firstly an identification of the major process functions has been made. The preliminary plant I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using a system engineering tool, Enterprise Architect (EA). Conceptual I&C architecture and interfaces with the ITER networks and other Plants have been discussed along with the basic configuration of I&C cubicles, aiming towards nine I&C deliverables for the design review.

  19. Guideline implementation in clinical practice: use of statistical process control charts as visual feedback devices.

    Science.gov (United States)

    Al-Hussein, Fahad A

    2009-01-01

    To use statistical control charts in a series of audits to improve the acceptance and consistent use of guidelines, and to reduce the variations in prescription processing in primary health care. A series of audits was done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions given in the two previous weeks. Thirty-six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by the failure to record the patient's weight (16.4%), the pharmacist's name (14.3%), the duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
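
    A minimal sketch of the p-chart construction around the 25% specification, with invented fortnightly counts of non-conforming prescriptions out of samples of n = 30.

      import numpy as np

      n, p0 = 30, 0.25
      nonconforming = np.array([16, 13, 12, 11, 9, 8, 7, 6])   # per audit
      p_hat = nonconforming / n

      sigma = np.sqrt(p0 * (1 - p0) / n)
      ucl = p0 + 3 * sigma
      lcl = max(p0 - 3 * sigma, 0.0)
      print(f"limits around p0={p0}: LCL={lcl:.3f}, UCL={ucl:.3f}")
      print("out-of-control audits:", np.flatnonzero((p_hat > ucl) | (p_hat < lcl)))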

  20. Errors in patient specimen collection: application of statistical process control.

    Science.gov (United States)

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.

  1. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    Science.gov (United States)

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, namely a twin-screw high-shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow, and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T2 and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
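
    The monitoring scheme can be sketched as follows: fit principal component analysis to normal-operating-condition (NOC) data, then compute Hotelling's T2 and the Q statistic (squared prediction error) for each new observation. The data, component count and imposed disturbance are synthetic assumptions, not ConsiGma-25 values.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      noc = rng.normal(size=(200, 35))          # 35 logged process variables
      mu, sd = noc.mean(0), noc.std(0)
      pca = PCA(n_components=5).fit((noc - mu) / sd)

      def t2_q(x):
          xs = (x - mu) / sd
          scores = pca.transform(xs.reshape(1, -1))
          t2 = float((scores**2 / pca.explained_variance_).sum())  # Hotelling's T2
          resid = xs - pca.inverse_transform(scores).ravel()
          return t2, float(resid @ resid)                          # Q statistic

      x_new = rng.normal(size=35)
      x_new[3] += 6.0                           # imposed disturbance on one variable
      print("T2 = %.2f, Q = %.2f" % t2_q(x_new))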

  2. Asthma control in Latin America: the Asthma Insights and Reality in Latin America (AIRLA) survey.

    Science.gov (United States)

    Neffen, Hugo; Fritscher, Carlos; Schacht, Francisco Cuevas; Levy, Gur; Chiarella, Pascual; Soriano, Joan B; Mechali, Daniel

    2005-03-01

    The aims of this survey were (1) to assess the quality of asthma treatment and control in Latin America, (2) to determine how closely asthma management guidelines are being followed, and (3) to assess perception, knowledge and attitudes related to asthma in Latin America. We surveyed a household sample of 2,184 adults or parents of children with asthma in 2003 in 11 countries in Latin America. Respondents were asked about healthcare utilization, symptom severity, activity limitations and medication use. Daytime asthma symptoms were reported by 56% of the respondents, and 51% reported being awakened by their asthma at night. More than half of those surveyed had been hospitalized, attended a hospital emergency service or made unscheduled emergency visits to other healthcare facilities for asthma during the previous year. Patient perception of asthma control did not match symptom severity, even in patients with severe persistent asthma, 44.7% of whom regarded their disease as being well or completely controlled. Only 2.4% (2.3% adults and 2.6% children) met all criteria for asthma control. Although 37% reported treatment with prescription medications, only 6% were using inhaled corticosteroids. Most adults (79%) and children (68%) in this survey reported that asthma symptoms limited their activities. Absence from school and work was reported by 58% of the children and 31% of adults, respectively. Asthma control in Latin America falls short of goals in international guidelines, and in many aspects asthma care and control in Latin America suffer from the same shortcomings as in other areas of the world.

  3. Surveying Low-Mass Star Formation with the Submillimeter Array

    Science.gov (United States)

    Dunham, Michael

    2018-01-01

    Large astronomical surveys yield important statistical information that can’t be derived from single-object and small-number surveys. In this talk I will review two recent surveys in low-mass star formation undertaken by the Submillimeter Array (SMA): a millimeter continuum survey of disks surrounding variably accreting young stars, and a complete continuum and molecular line survey of all protostars in the nearby Perseus Molecular Cloud. I will highlight several new insights into the processes by which low-mass stars gain their mass that have resulted from the statistical power of these surveys.

  4. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that

  5. Resources on Quantitative/Statistical Research for Applied Linguists

    Science.gov (United States)

    Brown, James Dean

    2004-01-01

    The purpose of this review article is to survey and evaluate existing books on quantitative/statistical research in applied linguistics. The article begins by explaining the types of texts that will not be reviewed, then it briefly describes nine books that address how to do quantitative/statistical applied linguistics research. The review then…

  6. An introduction to statistical thermodynamics

    CERN Document Server

    Hill, Terrell L

    1987-01-01

    ""A large number of exercises of a broad range of difficulty make this book even more useful…a good addition to the literature on thermodynamics at the undergraduate level."" - Philosophical MagazineAlthough written on an introductory level, this wide-ranging text provides extensive coverage of topics of current interest in equilibrium statistical mechanics. Indeed, certain traditional topics are given somewhat condensed treatment to allow room for a survey of more recent advances.The book is divided into four major sections. Part I deals with the principles of quantum statistical mechanics a

  7. Software and control system for the VLT Survey Telescope

    International Nuclear Information System (INIS)

    Schipani, P; Marty, L; Dall'Ora, M; D'Orsi, S; Argomedo, J; Arcidiacono, C; Farinato, J; Magrin, D; Ragazzoni, R; Umbriaco, G

    2013-01-01

    The VLT Survey Telescope (VST) has started the regular operations in 2011 after a successful commissioning at Cerro Paranal (Chile), the site which hosts the best facilities for optical astronomy operated by the European Southern Observatory (ESO). After a short description of the instrument, this paper mainly focuses on the telescope control software, which is in charge of the real-time control of the hardware and of the overall coordination of the operations, including pointing and tracking, active optics and presets. We describe the main features of the software implementation in the context of the ESO observatory standards, and the goals reached during the commissioning phase and in the first year of operations.

  8. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    Science.gov (United States)

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T2 statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy, and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
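
    The PLS side of such a comparison can be sketched as follows: regress a batch maturity index on NOC spectra, then chart Hotelling's T2 of the latent scores for new samples. scikit-learn offers no OPLS estimator, so the orthogonal filtering step is omitted here, and the spectra below are synthetic.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(7)
      maturity = np.linspace(0.0, 1.0, 60)       # nominal batch progress
      profile = rng.normal(size=300)             # spectral signature of the process
      X_noc = np.outer(maturity, profile) + 0.05 * rng.normal(size=(60, 300))

      pls = PLSRegression(n_components=2).fit(X_noc, maturity)
      lam = pls.x_scores_.var(axis=0, ddof=1)    # latent score variances

      def hotelling_t2(x):
          s = pls.transform(x.reshape(1, -1)).ravel()
          return float((s**2 / lam).sum())

      print("NOC sample T2:      ", round(hotelling_t2(X_noc[30]), 2))
      print("disturbed sample T2:", round(hotelling_t2(X_noc[30] + 0.3 * profile), 2))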

  9. National surveys: a way to manage treatment strategies in Parkinson's disease? Pharmaceutical prescribing patterns and patient experiences of symptom control and their impact on disease

    Directory of Open Access Journals (Sweden)

    Skogar Ö

    2013-07-01

    Full Text Available Örjan Skogar,1,2 Mats Nilsson,1 Carl-Johan Törnhage,3 Johan Lökk2 1Futurum Health Care Academy, Jönköping, 2Institution of Neurobiology, Care Sciences and Society, Karolinska Institutet, Karolinska University Hospital Huddinge, Stockholm, 3Department of Pediatrics, Skaraborg Hospital, Skövde, Sweden Background: The purpose of this study was to draw conclusions from patient-reported experiences in two national surveys from Scandinavia with the intention of comparing treatment strategies and increasing our knowledge of factors that affect the experiences of patients with Parkinson's disease (PD. Methods: A total of 2000 individuals in Sweden and 1300 in Norway were invited to complete postal surveys covering PD-related issues. Patient experiences of diagnostic procedures, symptom control, and follow-up in PD and the effects on symptom-related quality of life were collected. Pharmaceutical prescription data on anti-PD drugs and administrative data were collected from national registries. Results: The surveys were completed by 1553 (78% of the Swedish cohort and 1244 (96% of the Norwegian cohort. Only small differences were seen in disease duration and age distribution. Statistically as well as clinically significant differences in symptom control, diagnostic, and follow-up procedures, as well as in pharmacological treatment and impact on quality of life, were found between the national cohorts independent of disease duration. Conclusion: Information from separate national surveys has the potential to increase our knowledge of patient experiences in PD and can be used to compare, evaluate, educate, and guide health care staff and administrators in optimizing health care for patients with the disease. Keywords: parkinson's disease, diagnosis, follow-up, pharmaceutical prescription, quality of life, survey

  10. Blood phenylalanine control in phenylketonuria : a survey of 10 European centres

    NARCIS (Netherlands)

    Ahring, K.; Belanger-Quintana, A.; Dokoupil, K.; Gokmen-Ozel, H.; Lammardo, A. M.; MacDonald, A.; Motzfeldt, K.; Nowacka, M.; van Rijn, M.; Robert, M.

    Background: Only limited data are available on the blood phenylalanine (Phe) concentrations achieved in European patients with phenylketonuria (PKU) on a low-Phe diet. Objective: A survey was conducted to compare blood Phe control achieved in diet-treated patients with PKU of different age groups in

  11. An integrated model of statistical process control and maintenance based on the delayed monitoring

    International Nuclear Information System (INIS)

    Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei

    2015-01-01

    This paper develops an integrated model of statistical process control and maintenance decisions. The proposed delayed monitoring policy postpones the sampling process until a scheduled time and gives rise to ten scenarios for the production process, in which equipment failure may occur in addition to a quality shift. Equipment failure and a control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, cycle time and cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to validate the analysis of the ten-scenario model. An ordinary integrated model without delayed monitoring is also established for comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy will help reduce the expected cost
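
    The economic objective can be illustrated with renewal-reward arithmetic: given per-scenario occurrence probabilities, expected cycle times and cycle costs (all invented below), the long-run cost rate that the genetic algorithm minimises over the design variables is E[cycle cost] / E[cycle time].

      import numpy as np

      p = np.array([0.35, 0.20, 0.15, 0.10, 0.08, 0.05, 0.03, 0.02, 0.01, 0.01])
      t = np.array([100, 90, 80, 85, 70, 60, 75, 55, 50, 40], dtype=float)  # hours
      c = np.array([200, 260, 310, 330, 400, 450, 420, 520, 580, 640], dtype=float)

      assert np.isclose(p.sum(), 1.0)        # the ten scenarios are exhaustive
      cost_rate = (p @ c) / (p @ t)          # expected cost per unit time
      print(f"expected cost rate: {cost_rate:.3f} per hour")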

  12. A survey of drought and Variation of Vegetation by statistical indexes and remote sensing (Case study: Jahad forest in Bandar Abbas)

    International Nuclear Information System (INIS)

    Tamassoki, E; Soleymani, Z; Bahrami, F; Abbasgharemani, H

    2014-01-01

    Drought, as a creeping climatic phenomenon, causes enormous damage, especially in desert regions, so the need to manage and combat it is clear. Vegetation is damaged as well, and changes all the faster. This paper describes the process of vegetation change and examines it with drought indexes, both statistical and remote-sensing based, together with the correlation between temperature and relative humidity, using a Geographical Information System (GIS) and Remote Sensing (RS) in the forest park of Bandar Abbas over successive years. Finally, the regression and coefficient of determination are computed to show the importance of drought monitoring. Results revealed that the correlation between vegetation and the indexes was 0.5. Humidity had the highest correlation, and approaching 2009 the drought periods lengthen while the intervals between them shorten, which strongly affects vegetation and causes more area to lose its vegetation cover.

  13. MASKED AREAS IN SHEAR PEAK STATISTICS: A FORWARD MODELING APPROACH

    International Nuclear Information System (INIS)

    Bard, D.; Kratochvil, J. M.; Dawson, W.

    2016-01-01

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels, etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.
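
    A toy forward-modeling step might look like the sketch below: mask roughly 37% of a simulated convergence map, smooth the result as a crude stand-in for aperture-mass reconstruction, and count peaks above a threshold. The map, mask and threshold are illustrative, not Deep Lens Survey products.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(3)
      kappa = rng.normal(0.0, 0.02, size=(512, 512))   # toy convergence map
      unmasked = rng.random((512, 512)) > 0.37         # ~37% of the area masked

      masked_map = np.where(unmasked, kappa, 0.0)      # zero out masked pixels
      smoothed = ndimage.gaussian_filter(masked_map, sigma=4)

      # A peak is an unmasked pixel equal to the local maximum and above threshold
      local_max = ndimage.maximum_filter(smoothed, size=5) == smoothed
      threshold = 3 * smoothed[unmasked].std()
      peaks = local_max & (smoothed > threshold) & unmasked
      print("peak count:", int(peaks.sum()))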

  14. Monitoring the Future 2014 Survey Results

    Science.gov (United States)

    Infographic: Monitoring the Future is an annual survey of 8th-, 10th-, and 12th-grade students; this record presents the 2014 survey results in infographic form.

  15. Anxiety and Attitude of Graduate Students in On-Campus vs. Online Statistics Courses

    Science.gov (United States)

    DeVaney, Thomas A.

    2010-01-01

    This study compared levels of statistics anxiety and attitude toward statistics for graduate students in on-campus and online statistics courses. The Survey of Attitudes Toward Statistics and three subscales of the Statistics Anxiety Rating Scale were administered at the beginning and end of graduate level educational statistic courses.…

  16. Persuasiveness of Statistics and Patients’ and Mothers’ Narratives in Human Papillomavirus Vaccine Recommendation Messages: A Randomized Controlled Study in Japan

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Okuhara

    2018-04-01

    Full Text Available Background: The human papillomavirus (HPV) vaccination percentage among age-eligible girls in Japan is only in the single digits. This signals the need for effective vaccine communication tactics. This study aimed to examine the influence of statistical data and narrative HPV vaccination recommendation messages on recipients' vaccination intentions. Methods: This randomized controlled study covered 1,432 mothers who had daughters aged 12–16 years. It compared message persuasiveness among four conditions: statistical messages only; narrative messages of a patient who experienced cervical cancer, in addition to statistical messages; narrative messages of a mother whose daughter experienced cervical cancer, in addition to statistical messages; and a control. Vaccination intentions to have one's daughter(s) receive the HPV vaccine before and after reading intervention materials were assessed. Statistical analysis was conducted using analysis of variance with Tukey's test or the Games–Howell post hoc test, and analysis of covariance with Bonferroni correction. Results: Vaccination intentions after intervention in the three intervention conditions were higher than in the control condition (p < 0.001). A mother's narrative messages in addition to statistical messages increased HPV vaccination intention the most of all tested intervention conditions. A significant difference in the estimated means of intention, with covariate adjustment for the baseline value (i.e., intention before intervention), was found between a mother's narrative messages in addition to statistical messages and statistical messages only (p = 0.040). Discussion: Mothers' narrative messages may be persuasive when targeting mothers for promoting HPV vaccination. This may be because mothers can easily relate to and identify with communications from other mothers. However, for effective HPV vaccine communication, further studies are needed to understand more about persuasive

  17. Improved statistical method for temperature and salinity quality control

    Science.gov (United States)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.

  18. Race, Sex, and Their Influences on Introductory Statistics Education

    Science.gov (United States)

    van Es, Cindy; Weaver, Michelle M.

    2018-01-01

    The Survey of Attitudes Toward Statistics or SATS was administered for three consecutive years to students in an Introductory Statistics course at Cornell University. Questions requesting demographic information and expected final course grade were added. Responses were analyzed to investigate possible differences between sexes and racial/ethnic…

  19. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    Science.gov (United States)

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  20. Suicidal Behavior and Firearm Access: Results from the Second Injury Control and Risk Survey

    Science.gov (United States)

    Betz, Marian E.; Barber, Catherine; Miller, Matthew

    2011-01-01

    The association between home firearms and the likelihood and nature of suicidal thoughts and plans was examined using the Second Injury Control and Risk Survey, a 2001-2003 representative telephone survey of U.S. households. Of 9,483 respondents, 7.4% reported past-year suicidal thoughts, 21.3% with a plan. Similar proportions of those with and…

  1. Photovoltaic battery & charge controller market & applications survey. An evaluation of the photovoltaic system market for 1995

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, R.L.; Turpin, J.F.; Corey, G.P. [and others]

    1996-12-01

    Under the sponsorship of the Department of Energy, Office of Utility Technologies, the Battery Analysis and Evaluation Department and the Photovoltaic System Assistance Center of Sandia National Laboratories (SNL) initiated a U.S. industry-wide PV Energy Storage System Survey. Arizona State University (ASU) was contracted by SNL in June 1995 to conduct the survey. The survey included three separate segments tailored to: (a) PV system integrators, (b) battery manufacturers, and (c) PV charge controller manufacturers. The overall purpose of the survey was to: (a) quantify the market for batteries shipped with (or for) PV systems in 1995, (b) quantify the PV market segments by battery type and application for PV batteries, (c) characterize and quantify the charge controllers used in PV systems, (d) characterize the operating environment for energy storage components in PV systems, and (e) estimate the PV battery market for the year 2000. All three segments of the survey were mailed in January 1996. This report discusses the purpose, methodology, results, and conclusions of the survey.

  2. Survey of Current Status of Quality Control of Gamma Cameras in Republic of Korea

    International Nuclear Information System (INIS)

    Choe, Jae Gol; Joh, Cheol Woo

    2008-01-01

    It is widely recognized that a good quality control (QC) program is essential for adequate imaging diagnosis using gamma cameras. The purpose of this study is to survey the current status of QC of gamma cameras in the Republic of Korea with a view to implementing appropriate nationwide quality control guidelines and programs. Data were collected on the personnel, equipment, and appropriateness of each nuclear medicine imaging laboratory's quality control practice, by means of a formatted questionnaire distributed by mail or e-mail or completed in interviews. We also reviewed the current recommendations concerning quality assurance issued by international societies. The survey revealed that quality control practice is irregular and unsatisfactory. The irregularity seems due partly to the lack of trained personnel, equipment, budget, time, and hands-on guidelines. The implementation of a QC program may impose an additional burden on hospitals, patients, and nuclear medicine laboratories. However, the benefit of a good QC program is obvious: hospitals can provide good-quality nuclear medicine imaging studies to their patients. It is important to use the least cumbersome QC protocol, to educate nuclear medicine and hospital administrative personnel about QC, and to establish national QC guidelines to help each individual nuclear medicine laboratory.

  3. Methods for estimating private forest ownership statistics: revised methods for the USDA Forest Service's National Woodland Owner Survey

    Science.gov (United States)

    Brenton J. ​Dickinson; Brett J. Butler

    2013-01-01

    The USDA Forest Service's National Woodland Owner Survey (NWOS) is conducted to better understand the attitudes and behaviors of private forest ownerships, which control more than half of US forestland. Inferences about the populations of interest should be based on theoretically sound estimation procedures. A recent review of the procedures disclosed an error in...

  4. The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical Lens Sample from the Fifth Data Release

    Energy Technology Data Exchange (ETDEWEB)

    Inada, Naohisa; /Wako, RIKEN /Tokyo U., ICEPP; Oguri, Masamune; /Natl. Astron. Observ. of Japan /Stanford U., Phys. Dept.; Shin, Min-Su; /Michigan U. /Princeton U. Observ.; Kayo, Issha; /Tokyo U., ICRR; Strauss, Michael A.; /Princeton U. Observ.; Hennawi, Joseph F.; /UC, Berkeley /Heidelberg, Max Planck Inst. Astron.; Morokuma, Tomoki; /Natl. Astron. Observ. of Japan; Becker, Robert H.; /LLNL, Livermore /UC, Davis; White, Richard L.; /Baltimore, Space Telescope Sci.; Kochanek, Christopher S.; /Ohio State U.; Gregg, Michael D.; /LLNL, Livermore /UC, Davis /Exeter U.

    2010-05-01

    We present the second report of our systematic search for strongly lensed quasars from the data of the Sloan Digital Sky Survey (SDSS). From extensive follow-up observations of 136 candidate objects, we find 36 lenses in the full sample of 77,429 spectroscopically confirmed quasars in the SDSS Data Release 5. We then define a complete sample of 19 lenses, including 11 from our previous search in the SDSS Data Release 3, from the sample of 36,287 quasars with i < 19.1 in the redshift range 0.6 < z < 2.2, where we require the lenses to have image separations of 1″ < θ < 20″ and i-band magnitude differences between the two images smaller than 1.25 mag. Among the 19 lensed quasars, 3 have quadruple-image configurations, while the remaining 16 show double images. This lens sample constrains the cosmological constant to be Ω_Λ = 0.84 +0.06/−0.08 (stat.) +0.09/−0.07 (syst.) assuming a flat universe, which is in good agreement with other cosmological observations. We also report the discoveries of 7 binary quasars with separations ranging from 1.1″ to 16.6″, which are identified in the course of our lens survey. This study concludes the construction of our statistical lens sample in the full SDSS-I data set.

  5. Bioinspired Intelligent Algorithm and Its Applications for Mobile Robot Control: A Survey

    Science.gov (United States)

    Ni, Jianjun; Wu, Liuying; Fan, Xinnan; Yang, Simon X.

    2016-01-01

    Bioinspired intelligent algorithm (BIA) is a kind of intelligent computing method whose working mechanism is more lifelike and biological than that of other types. BIAs have made significant progress both in the understanding of neuroscience and biological systems and in their application to various fields. Mobile robot control is one of the main application fields of BIAs and has attracted more and more attention, because mobile robots can be used widely and general artificial intelligence algorithms meet a development bottleneck in this field, owing to complex computing and the dependence on high-precision sensors. This paper presents a survey of recent research in BIAs, which focuses on the realization of various BIAs based on different working mechanisms and on their applications for mobile robot control, to help in understanding BIAs comprehensively and clearly. The survey has four primary parts: a classification of BIAs by biomimetic mechanism, a summary of several typical BIAs at different levels, an overview of current applications of BIAs in mobile robot control, and a description of some possible future directions for research. PMID:26819582

  6. Bioinspired Intelligent Algorithm and Its Applications for Mobile Robot Control: A Survey.

    Science.gov (United States)

    Ni, Jianjun; Wu, Liuying; Fan, Xinnan; Yang, Simon X

    2016-01-01

    Bioinspired intelligent algorithm (BIA) is a kind of intelligent computing method whose working mechanism is more lifelike and biological than that of other types. BIAs have made significant progress both in the understanding of neuroscience and biological systems and in their application to various fields. Mobile robot control is one of the main application fields of BIAs and has attracted more and more attention, because mobile robots can be used widely and general artificial intelligence algorithms meet a development bottleneck in this field, owing to complex computing and the dependence on high-precision sensors. This paper presents a survey of recent research in BIAs, which focuses on the realization of various BIAs based on different working mechanisms and on their applications for mobile robot control, to help in understanding BIAs comprehensively and clearly. The survey has four primary parts: a classification of BIAs by biomimetic mechanism, a summary of several typical BIAs at different levels, an overview of current applications of BIAs in mobile robot control, and a description of some possible future directions for research.

  7. Bioinspired Intelligent Algorithm and Its Applications for Mobile Robot Control: A Survey

    Directory of Open Access Journals (Sweden)

    Jianjun Ni

    2016-01-01

    Full Text Available Bioinspired intelligent algorithm (BIA) is a kind of intelligent computing method whose working mechanism is more lifelike and biological than that of other types. BIAs have made significant progress both in the understanding of neuroscience and biological systems and in their application to various fields. Mobile robot control is one of the main application fields of BIAs and has attracted more and more attention, because mobile robots can be used widely and general artificial intelligence algorithms meet a development bottleneck in this field, owing to complex computing and the dependence on high-precision sensors. This paper presents a survey of recent research in BIAs, which focuses on the realization of various BIAs based on different working mechanisms and on their applications for mobile robot control, to help in understanding BIAs comprehensively and clearly. The survey has four primary parts: a classification of BIAs by biomimetic mechanism, a summary of several typical BIAs at different levels, an overview of current applications of BIAs in mobile robot control, and a description of some possible future directions for research.

  8. Use of statistic control of the process as part of a quality assurance plan; Empleo del control estadistico de proceso como parte de un plan de aseguramiento de la calidad

    Energy Technology Data Exchange (ETDEWEB)

    Acosta, S.; Lewis, C., E-mail: sacosta@am.gob.ar [Autoridad Regulatoria Nuclear (ARN), Buenos Aires (Argentina)

    2013-07-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of results through the control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory for environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work is the control of blanks, and so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to account for significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of statistical process control. The data obtained yielded the lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan.
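
    A minimal illustration of the blank control chart idea described above (synthetic numbers, not the laboratory's data): an individuals chart whose centre line and limits are estimated from repeated blank measurements, with sigma taken from the mean moving range.

```python
import numpy as np

def i_chart_limits(blanks):
    """Centre line and 3-sigma limits of an individuals (I) chart.

    Sigma is estimated from the mean moving range of successive blank
    measurements, divided by the d2 constant for subgroups of size 2 (1.128).
    """
    x = np.asarray(blanks, dtype=float)
    center = x.mean()
    sigma_hat = np.abs(np.diff(x)).mean() / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

rng = np.random.default_rng(0)
blanks = rng.normal(0.05, 0.01, size=60)   # hypothetical blank count rates
lcl, cl, ucl = i_chart_limits(blanks)
print(f"LCL={lcl:.4f}  CL={cl:.4f}  UCL={ucl:.4f}")
```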

  9. A Survey of Congestion Control Techniques and Data Link Protocols in Satellite Networks

    OpenAIRE

    Fahmy, Sonia; Jain, Raj; Lu, Fang; Kalyanaraman, Shivkumar

    1998-01-01

    Satellite communication systems are the means of realizing a global broadband integrated services digital network. Due to the statistical nature of the integrated services traffic, the resulting rate fluctuations and burstiness render congestion control a complicated, yet indispensable function. The long propagation delay of the earth-satellite link further imposes severe demands and constraints on the congestion control schemes, as well as the media access control techniques and retransmissi...

  10. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

    Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  11. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust

  12. Research design and statistical methods in Indian medical journals: a retrospective survey.

    Science.gov (United States)

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S

    2015-01-01

    Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p < …) and the proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p < …). The use of randomized designs has remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p < …) and interpretation (p < …). Overall, research seems to have made no major progress regarding the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have high proportion of
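
    The first comparison reported above can be reproduced from the counts quoted in the abstract; a quick sketch using scipy, with the 2x2 table of erroneous vs. correct analyses reconstructed from those fractions:

```python
from scipy.stats import chi2_contingency

# 2003: 80 of 320 analyses erroneous; 2013: 111 of 490 erroneous
table = [[80, 320 - 80],
         [111, 490 - 111]]
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.3f}, p = {p:.4f}")  # abstract reports chi2 = 0.592, p = 0.4418
```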

  13. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    Science.gov (United States)

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features are facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. A control system of a mini survey facility for photometric monitoring

    Science.gov (United States)

    Tsutsui, Hironori; Yanagisawa, Kenshi; Izumiura, Hideyuki; Shimizu, Yasuhiro; Hanaue, Takumi; Ita, Yoshifusa; Ichikawa, Takashi; Komiyama, Takahiro

    2016-08-01

    We have built a control system for a mini survey facility dedicated to photometric monitoring of nearby bright (K < …) stars. The facility comprises a dome and a small (30-mm aperture) wide-field (5 × 5 sq. deg. field of view) infrared (1.0-2.5 microns) camera on an equatorial fork mount, as well as power sources and other associated equipment. All the components other than the camera are controlled by microcomputer-based I/O boards that were developed in-house and are used in many of the open-use instruments in our observatory. We present the specifications and configuration of the facility hardware, as well as the structure of its control software.

  15. Control and Chance in Music and Art a Survey of Philosophies

    Directory of Open Access Journals (Sweden)

    Shuang Cai

    2018-02-01

    Full Text Available This article is a survey and review of several writings on the philosophies and compositional techniques involving control and chance in the creation of modern art and music. The purpose of discussing and comparing these writings is to trace different understandings, reactions, and interpretations of these philosophies in order to offer a more informed perspective on these oft misunderstood techniques. The first article analyzed is Robert Charles Clark’s “Total Control and Chance in Musics: A Philosophical Analysis,” which discusses fundamental issues regarding both total control and chance music. The second article, Stephanie Ross’ “Chance, Constraint, and Creativity: The Awfulness of Modern Music,” presents some of the adverse reactions to these methods of composition. The third and fourth articles, Roland Barthes’ “The Death of the Author” and “From Work to Text,” offer a broader philosophical viewpoint on the different roles of the author and their product when creating art. The final article, Jeongwon Joe and S. Hoon Song’s “Roland Barthes’ ‘Text’ and Aleatoric Music: Is the ‘Birth of the Reader’ the Birth of the Listener?” concludes this survey by tying Barthes’ concepts back to music.

  16. Adaptive sampling rate control for networked systems based on statistical characteristics of packet disordering.

    Science.gov (United States)

    Li, Jin-Na; Er, Meng-Joo; Tan, Yen-Kheng; Yu, Hai-Bin; Zeng, Peng

    2015-09-01

    This paper investigates an adaptive sampling rate control scheme for networked control systems (NCSs) subject to packet disordering. The main objectives of the proposed scheme are (a) to avoid heavy packet disordering existing in communication networks and (b) to stabilize NCSs with packet disordering, transmission delay and packet loss. First, a novel sampling rate control algorithm based on statistical characteristics of disordering entropy is proposed; secondly, an augmented closed-loop NCS that consists of a plant, a sampler and a state-feedback controller is transformed into an uncertain and stochastic system, which facilitates the controller design. Then, a sufficient condition for stochastic stability in terms of Linear Matrix Inequalities (LMIs) is given. Moreover, an adaptive tracking controller is designed such that the sampling period tracks a desired sampling period, which represents a significant contribution. Finally, experimental results are given to illustrate the effectiveness and advantages of the proposed scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Application of MARSSIM for Final Status Survey of the Decommissioning Project

    International Nuclear Information System (INIS)

    Hong, Sang Bum; Lee, Ki Won; Park, Jin Ho; Chung, Un Soo

    2011-01-01

    The release of a site and buildings from regulatory control is the final stage of the decommissioning process. The MARSSIM (Multi-Agency Radiation Survey and Site Investigation Manual) provides an overall framework for conducting data collection in a final status survey to demonstrate compliance with site closure requirements. KAERI established a final status survey for the site and buildings of the Korea Research Reactor using the guidance provided in MARSSIM. The release criteria for the site and buildings were set up based on site-specific release levels calculated using the RESRAD and RESRAD-Build codes. The survey design for the site and buildings was classified using the survey dataset and the potential for contamination. The number of samples in each survey unit was calculated through a statistical test using the data collected from the scoping and characterization surveys. The results of the final status survey satisfied the release criteria based on an evaluation of the measured data.
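
    For context, a sketch of the MARSSIM-style sample-size calculation for the Wilcoxon Rank Sum (WRS) test used in final status survey design; the formula, the normal approximation for Pr, and the 20% allowance are stated here from the MARSSIM methodology as understood and should be verified against the manual before use.

```python
import math
from scipy.stats import norm

def wrs_samples_per_area(alpha, beta, delta, sigma):
    """Samples needed in each of the survey unit and the reference area.

    Pr is the probability that a survey-unit measurement exceeds a
    reference-area measurement for a mean shift delta (normal
    approximation); MARSSIM adds ~20% for lost or unusable data.
    """
    pr = norm.cdf(delta / (sigma * math.sqrt(2)))
    n_total = (norm.ppf(1 - alpha) + norm.ppf(1 - beta)) ** 2 / (3 * (pr - 0.5) ** 2)
    return math.ceil(1.2 * n_total / 2)

# Example: alpha = beta = 0.05 and a shift of one standard deviation
print(wrs_samples_per_area(0.05, 0.05, delta=1.0, sigma=1.0))  # -> 32
```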

  18. 78 FR 78966 - Board of Scientific Counselors, National Center for Health Statistics

    Science.gov (United States)

    2013-12-27

    ... Scientific Counselors, National Center for Health Statistics In accordance with section 10(a)(2) of the...), National Center for Health Statistics (NCHS) announces the following meeting of the aforementioned..., NCHS; discussion of vital statistics; future program reviews; National Health Interview Survey 2017...

  19. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1997-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to that of a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)

  20. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E; Storch, H von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1998-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to that of a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
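
    As a concrete illustration of the analog method discussed in these two records, a toy sketch with synthetic data (all arrays here are hypothetical): each GCM large-scale field is matched to its most similar historical field, and the regional observation of that analog day is used as the downscaled prediction.

```python
import numpy as np

rng = np.random.default_rng(1)
hist_large = rng.normal(size=(1000, 20))    # historical large-scale patterns (e.g., EOF space)
hist_rain = rng.gamma(2.0, 1.0, size=1000)  # regional rainfall observed on those days
gcm_large = rng.normal(size=(30, 20))       # large-scale fields simulated by a GCM

def analog_downscale(fields, library, regional_obs):
    """Predict a regional value for each field from its nearest analog."""
    predictions = np.empty(len(fields))
    for i, field in enumerate(fields):
        dist = np.linalg.norm(library - field, axis=1)   # similarity measure
        predictions[i] = regional_obs[np.argmin(dist)]   # copy the analog's observation
    return predictions

print(analog_downscale(gcm_large, hist_large, hist_rain)[:5])
```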

  1. Comparative Study of Complex Survey Estimation Software in ONS

    Directory of Open Access Journals (Sweden)

    Andy Fallows

    2015-09-01

    Full Text Available Many official statistics across the UK Government Statistical Service (GSS) are produced using data collected from sample surveys. These survey data are used to estimate population statistics through weighting and calibration techniques. For surveys with complex or unusual sample designs, the weighting can be fairly complicated. Even in more simple cases, appropriate software is required to implement survey weighting and estimation. As with other stages of the survey process, it is preferable to use a standard, generic calibration tool wherever possible. Standard tools allow for efficient use of resources and assist with the harmonisation of methods. In the case of calibration, the Office for National Statistics (ONS) has experience of using the Statistics Canada Generalized Estimation System (GES) across a range of business and social surveys. GES is a SAS-based system and so is only available in conjunction with an appropriate SAS licence. Given recent initiatives and encouragement to investigate open source solutions across government, it is appropriate to determine whether there are any open source calibration tools available that can provide the same service as GES. This study compares the use of GES with the calibration tool ‘R evolved Generalized software for sampling estimates and errors in surveys’ (ReGenesees), available in R, an open source statistical programming language which is beginning to be used in many statistical offices. ReGenesees is a free R package which has been developed by the Italian statistics office (Istat) and includes functionality to calibrate survey estimates using similar techniques to GES. This report describes an analysis of the performance of ReGenesees in comparison to GES in calibrating a representative selection of ONS surveys. Section 1.1 provides a brief introduction to the current use of SAS and R in ONS. Section 2 describes GES and ReGenesees in more detail. Sections 3.1 and 3.2 consider methods for
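
    To make the calibration idea concrete, a hand-rolled post-stratification sketch in Python (this is not the GES or ReGenesees API, only the underlying principle): design weights are scaled within groups so that the weighted counts match known population totals.

```python
import numpy as np

def poststratify(weights, strata, pop_totals):
    """Scale weights so each stratum's weighted count equals its population total."""
    w = np.asarray(weights, dtype=float).copy()
    for stratum, total in pop_totals.items():
        mask = strata == stratum
        w[mask] *= total / w[mask].sum()
    return w

design_weights = np.full(6, 10.0)
strata = np.array(["m", "m", "m", "f", "f", "f"])
calibrated = poststratify(design_weights, strata, {"m": 40.0, "f": 25.0})
print(calibrated)  # weighted counts now reproduce the known totals 40 and 25
```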

  2. Resources on quantitative/statistical research for applied linguists

    OpenAIRE

    Brown, James Dean

    2004-01-01

    The purpose of this review article is to survey and evaluate existing books on quantitative/statistical research in applied linguistics. The article begins by explaining the types of texts that will not be reviewed, then it briefly describes nine books that address how to do quantitative/statistical applied linguistics research. The review then compares (in prose and tables) the general characteris...

  3. Crowdsourcing quality control for Dark Energy Survey images

    Science.gov (United States)

    Melchior, P.; Sheldon, E.; Drlica-Wagner, A.; Rykoff, E. S.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Brooks, D.; Buckley-Geer, E.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Doel, P.; Evrard, A. E.; Finley, D. A.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Honscheid, K.; James, D. J.; Jarvis, M.; Kuehn, K.; Li, T. S.; Maia, M. A. G.; March, M.; Marshall, J. L.; Nord, B.; Ogando, R.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Vikram, V.; Walker, A. R.; Wester, W.; Zhang, Y.

    2016-07-01

    We have developed a crowdsourcing web application for image quality control employed by the Dark Energy Survey. Dubbed the "DES exposure checker", it renders science-grade images directly to a web browser and allows users to mark problematic features from a set of predefined classes. Users can also generate custom labels and thus help identify previously unknown problem classes. User reports are fed back to hardware and software experts to help mitigate and eliminate recognized issues. We report on the implementation of the application and our experience with its over 100 users, the majority of which are professional or prospective astronomers but not data management experts. We discuss aspects of user training and engagement, and demonstrate how problem reports have been pivotal to rapidly correct artifacts which would likely have been too subtle or infrequent to be recognized otherwise. We conclude with a number of important lessons learned, suggest possible improvements, and recommend this collective exploratory approach for future astronomical surveys or other extensive data sets with a sufficiently large user base. We also release open-source code of the web application and host an online demo version at http://des-exp-checker.pmelchior.net.

  4. The outlier sample effects on multivariate statistical data processing geochemical stream sediment survey (Moghangegh region, North West of Iran)

    International Nuclear Information System (INIS)

    Ghanbari, Y.; Habibnia, A.; Memar, A.

    2009-01-01

    In geochemical stream sediment surveys in the Moghangegh Region in north west of Iran (sheet 1:50,000), 152 samples were collected, and after analysis and processing of the data it was revealed that the Yb, Sc, Ni, Li, Eu, Cd, Co, and As contents in one sample are far higher than in the other samples. After detecting this sample as an outlier, the effect of this sample on multivariate statistical data processing was investigated to show the destructive effects of outlier samples in geochemical exploration. Pearson and Spearman correlation coefficient methods and cluster analysis were used for the multivariate studies, and the scatter plots of some elements together with the regression profiles are given for the cases of 152 and 151 samples and the results are compared. After investigating the multivariate statistical data processing results, it was realized that the existence of outlier samples may produce the following relations between elements: a true relation between two elements, neither of which has an outlier frequency in the outlier sample; a false relation between two elements, one of which has an outlier frequency in the outlier sample; and a completely false relation between two elements, both of which have outlier frequencies in the outlier sample.
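
    The effect the abstract describes is easy to demonstrate with synthetic data: a single sample that is extreme in two otherwise independent elements manufactures a strong Pearson correlation, while the rank-based Spearman coefficient is far less affected.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(42)
ni = rng.normal(50, 5, 151)    # hypothetical Ni contents in 151 samples
co = rng.normal(20, 3, 151)    # hypothetical Co contents, independent of Ni

print("no outlier:  ", round(pearsonr(ni, co)[0], 3), round(spearmanr(ni, co)[0], 3))

ni_out = np.append(ni, 500.0)  # one outlier sample, extreme in both elements
co_out = np.append(co, 200.0)
print("with outlier:", round(pearsonr(ni_out, co_out)[0], 3),
      round(spearmanr(ni_out, co_out)[0], 3))
```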

  5. A Survey on Open-Source Flight Control Platforms of Unmanned Aerial Vehicle

    DEFF Research Database (Denmark)

    Ebeid, Emad Samuel Malki; Skriver, Martin; Jin, Jie

    2017-01-01

    Recently, Unmanned Aerial Vehicles (UAVs), so-called drones, have gotten a lot of attention in academic research and commercial applications due to their simple structure, ease of operation and low-cost hardware components. The flight controller, an embedded electronics component, represents the core part of the drone. It performs the main operations of the drone (e.g., autonomous control and navigation). There are various types of flight controllers, and each has its own characteristics and features. This paper presents an extensive survey of the publicly available open-source flight control platforms.

  6. Indexing contamination surveys

    International Nuclear Information System (INIS)

    Brown, R.L.

    1998-01-01

    The responsibility for safely managing the Tank Farms at Hanford belongs to Lockheed Martin Hanford Corporation, which is part of the six-company Project Hanford Management Team led by Fluor Daniel Hanford, Inc. These Tank Farm facilities contain numerous outdoor contamination areas, which are surveyed at a periodicity consistent with the potential radiological conditions, occupancy, and risk of changes in radiological conditions. This document describes the survey documentation and data tracking method devised to track the results of contamination surveys; this process is referred to as indexing. The indexing process takes a representative data set as an indicator of the contamination status of the facility. The data are further manipulated into a single value that can be tracked and trended using standard statistical methodology. To report meaningful data, the routine contamination surveys must be performed in a manner that allows the survey method and the data collection process to be recreated. Three key criteria are necessary to accomplish this goal: accurate maps, consistent documentation, and consistent consolidation of data. Meeting these criteria provides data of sufficient quality to be tracked. Tracking of survey data is accomplished by converting the individual survey results into a weighted value, corrected for the actual number of survey points. This information can be compared over time using standard statistical analysis to identify trends. At the Tank Farms, the need to track and trend the facility's radiological status presents unique challenges. Many of these Tank Farm facilities date back to the Second World War. The facilities are exposed to weather extremes and plant and animal intrusion, as well as all of the normal challenges associated with handling radiological waste streams. Routine radiological surveys alone did not provide a radiological status adequate for continuing comparisons.
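
    A minimal sketch of the indexing arithmetic described above (the weighting scheme here is an illustrative assumption, not the documented Hanford procedure): each survey is reduced to a single normalized index, and the monthly indices are trended by least squares.

```python
import numpy as np

def survey_index(point_results, n_points_surveyed):
    """Collapse one survey into a single value: the weighted sum of point
    results normalized by the number of points actually surveyed."""
    return np.sum(point_results) / n_points_surveyed

# Hypothetical monthly indices for one facility
indices = np.array([0.12, 0.15, 0.11, 0.18, 0.21, 0.19])
months = np.arange(len(indices))
slope, intercept = np.polyfit(months, indices, 1)
print(f"trend: {slope:+.3f} index units per month")  # positive slope = degrading
```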

  7. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available This study analyzes data with the aim of improving the quality of statistical tools used in the assembly processes of automobile seats. Normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are increasingly many approaches to handling non-normal data. The appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution with the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each characteristic of quality (Safety Regulation, S/R) individually. The study focuses on the measured data from airbag assembly and aims to obtain normally distributed data and apply statistical process control to it. The results lead to rejection of the null hypothesis (the measured variables do not follow a normal distribution); therefore it is necessary to work on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to quality output of the whole statistical control of manufacturing processes.
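
    A companion sketch of the workflow the abstract describes, using Python instead of StatGraphics/Minitab (synthetic skewed data): test normality, and when it is rejected, apply a Box-Cox transformation before building control charts or capability indices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.lognormal(mean=0.0, sigma=0.5, size=200)  # skewed quality characteristic

stat, p = stats.shapiro(x)
print(f"raw data: Shapiro-Wilk p = {p:.4f}")       # small p: reject normality

x_t, lam = stats.boxcox(x)                         # requires strictly positive data
stat_t, p_t = stats.shapiro(x_t)
print(f"Box-Cox (lambda = {lam:.2f}): p = {p_t:.4f}")
```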

  8. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  9. Gonorrhea Statistics

    Science.gov (United States)


  10. Methodological Issues in Survey Research: A Historical Review

    NARCIS (Netherlands)

    de Heer, W.; de Leeuw, E.D.; van der Zouwen, J.

    1999-01-01

    In this paper, we present a historical overview of social surveys and describe the historical development of scientific survey methodology and survey statistics. The origins of survey research can be traced back to the early 19th century and the first scientific survey was conducted in England in

  11. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    Science.gov (United States)

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  12. Errors in practical measurement in surveying, engineering, and technology

    International Nuclear Information System (INIS)

    Barry, B.A.; Morris, M.D.

    1991-01-01

    This book discusses statistical measurement, error theory, and statistical error analysis. Topics include an introduction to measurement, measurement errors, the reliability of measurements, probability theory of errors, measures of reliability, reliability of repeated measurements, propagation of errors in computing, errors and weights, the practical application of the theory of errors in measurement, and two-dimensional errors; a bibliography is included. Appendices address significant figures in measurement, basic concepts of probability and the normal probability curve, writing a sample specification for a procedure, classification, standards of accuracy, general specifications of geodetic control surveys, the geoid, the frequency distribution curve, and the computer and calculator solution of problems.

  13. Statistics Anxiety and Instructor Immediacy

    Science.gov (United States)

    Williams, Amanda S.

    2010-01-01

    The purpose of this study was to investigate the relationship between instructor immediacy and statistics anxiety. It was predicted that students receiving immediacy would report lower levels of statistics anxiety. Using a pretest-posttest-control group design, immediacy was measured using the Instructor Immediacy scale. Statistics anxiety was…

  14. A Survey of Security Tools for the Industrial Control System Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hurd, Carl M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); McCarty, Michael V. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-12

    This report details the results of a survey conducted by Idaho National Laboratory (INL) to identify existing tools which could be used to prevent, detect, mitigate, or investigate a cyber-attack in an industrial control system (ICS) environment. This report compiles a list of potentially applicable tools and shows the coverage of the tools in an ICS architecture.

  15. A Survey of Security Tools for the Industrial Control System Environment

    International Nuclear Information System (INIS)

    Hurd, Carl M.; McCarty, Michael V.

    2017-01-01

    This report details the results of a survey conducted by Idaho National Laboratory (INL) to identify existing tools which could be used to prevent, detect, mitigate, or investigate a cyber-attack in an industrial control system (ICS) environment. This report compiles a list of potentially applicable tools and shows the coverage of the tools in an ICS architecture.

  16. GALEX-SDSS CATALOGS FOR STATISTICAL STUDIES

    International Nuclear Information System (INIS)

    Budavari, Tamas; Heinis, Sebastien; Szalay, Alexander S.; Nieto-Santisteban, Maria; Bianchi, Luciana; Gupchup, Jayant; Shiao, Bernie; Smith, Myron; Chang Ruixiang; Kauffmann, Guinevere; Morrissey, Patrick; Wyder, Ted K.; Martin, D. Christopher; Barlow, Tom A.; Forster, Karl; Friedman, Peter G.; Schiminovich, David; Milliard, Bruno; Donas, Jose; Seibert, Mark

    2009-01-01

    We present a detailed study of the Galaxy Evolution Explorer's (GALEX) photometric catalogs with special focus on the statistical properties of the All-sky and Medium Imaging Surveys. We introduce the concept of primaries to resolve the issue of multiple detections and follow a geometric approach to define clean catalogs with well understood selection functions. We cross-identify the GALEX sources (GR2+3) with Sloan Digital Sky Survey (SDSS; DR6) observations, which indirectly provides an invaluable insight into the astrometric model of the UV sources and allows us to revise the band merging strategy. We derive the formal description of the GALEX footprints as well as their intersections with the SDSS coverage along with analytic calculations of their areal coverage. The crossmatch catalogs are made available for the public. We conclude by illustrating the implementation of typical selection criteria in SQL for catalog subsets geared toward statistical analyses, e.g., correlation and luminosity function studies.

  17. Using a spatial and tabular database to generate statistics from terrain and spectral data for soil surveys

    Science.gov (United States)

    Horvath, E.A.; Fosnight, E.A.; Klingebiel, A.A.; Moore, D.G.; Stone, J.E.; Reybold, W.U.; Petersen, G.W.

    1987-01-01

    A methodology has been developed to create a spatial database by referencing digital elevation, Landsat multispectral scanner data, and digitized soil premap delineations of a number of adjacent 7.5-min quadrangle areas to a 30-m Universal Transverse Mercator projection. Slope and aspect transformations are calculated from elevation data and grouped according to field office specifications. An unsupervised classification is performed on a brightness and greenness transformation of the spectral data. The resulting spectral, slope, and aspect maps of each of the 7.5-min quadrangle areas are then plotted and submitted to the field office to be incorporated into the soil premapping stages of a soil survey. A tabular database is created from spatial data by generating descriptive statistics for each data layer within each soil premap delineation. The tabular data base is then entered into a data base management system to be accessed by the field office personnel during the soil survey and to be used for subsequent resource management decisions.Large amounts of data are collected and archived during resource inventories for public land management. Often these data are stored as stacks of maps or folders in a file system in someone's office, with the maps in a variety of formats, scales, and with various standards of accuracy depending on their purpose. This system of information storage and retrieval is cumbersome at best when several categories of information are needed simultaneously for analysis or as input to resource management models. Computers now provide the resource scientist with the opportunity to design increasingly complex models that require even more categories of resource-related information, thus compounding the problem.Recently there has been much emphasis on the use of geographic information systems (GIS) as an alternative method for map data archives and as a resource management tool. Considerable effort has been devoted to the generation of tabular
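
    The tabular database described above boils down to zonal statistics: descriptive statistics of each raster layer computed within each soil premap delineation. A toy sketch with hypothetical arrays:

```python
import numpy as np

delineation = np.array([[1, 1, 2],
                        [1, 2, 2],
                        [3, 3, 2]])       # soil premap polygon ID of each cell
slope = np.array([[2.0, 3.0, 9.0],
                  [2.5, 8.0, 7.5],
                  [1.0, 1.5, 8.5]])       # slope layer (percent)

for poly_id in np.unique(delineation):
    values = slope[delineation == poly_id]
    print(f"polygon {poly_id}: mean={values.mean():.2f} "
          f"std={values.std():.2f} n={values.size}")
```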

  18. A survey of trust, control and information in networks

    DEFF Research Database (Denmark)

    Jakobsen, Morten

    This paper focuses on the characteristics managers take into account when they choose and evaluate business partners, and on the interrelationship between the constructs trust, control and information. The paper is based on a survey which includes 101 small and middle-sized manufacturing companies... a relationship between trust, control and information is found. The findings indicate that the three constructs are relevant, and the level of embeddedness is found to influence both the absolute and the relative importance of the three constructs, and thereby the role of management accounting at different development stages of relationships.

  19. A survey of United States dental hygienists' knowledge, attitudes, and practices with infection control guidelines.

    Science.gov (United States)

    Garland, Kandis V

    2013-06-01

    To assess knowledge, attitudes and practices of U.S. dental hygienists with respect to infection control guidelines (ICG). Research has shown that improved compliance with specific aspects of dental ICG is needed. This study supports the American Dental Hygienists' Association National Research Agenda's Occupational Health and Safety objective to investigate methods to decrease errors, risks and/or hazards in health care. Data are needed to assess compliance, prevention and behavioral issues with current ICG practices. A proportional stratified random sample (n=2,500) was recruited for an online survey. Descriptive statistics summarized demographic characteristics and knowledge, attitudes and practices responses. Spearman's rho correlations determined relationships between knowledge, attitudes and practices responses (p < …); compliance correlated with expectations for using ICG (rs=0.529) and with having no time to use them (rs=-0.537). Themes from comments indicated time is a barrier, and respondents perceived a need for involvement of all co-workers. Dental hygienists are adhering to most aspects of the ICG. High compliance with ICG among respondents in this study was associated with positive safety beliefs and practices, whereas lower compliance with ICG was associated with less positive safety beliefs and practices. A safety culture appears to be a factor in compliance with ICG.

  20. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
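
    The b-value at the heart of this record is conventionally estimated by maximum likelihood; a short sketch using the Aki/Utsu estimator on a synthetic catalog (the binning correction dm applies to catalogs with rounded magnitudes):

```python
import numpy as np

def b_value(mags, mc, dm=0.0):
    """Aki/Utsu maximum-likelihood b-value for magnitudes at or above mc."""
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with b = 1 and completeness Mc = 2
rng = np.random.default_rng(3)
mags = rng.exponential(scale=1.0 / np.log(10), size=5000) + 2.0
print(f"estimated b = {b_value(mags, mc=2.0):.2f}")  # close to 1.0
```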

  1. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    Science.gov (United States)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly-used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies - an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France), acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.

  2. Tools for Real-Time Control Systems Co-Design - A Survey

    OpenAIRE

    Henriksson, Dan; El-Khoury, Jad; Årzén, Karl-Erik; Törngren, Martin; Redell, Ola

    2005-01-01

    This report presents a survey of current simulation tools in the area of integrated control and real-time systems design. Each tool is presented with a quick overview followed by a more detailed section describing comparative aspects of the tool. These aspects describe the context and purpose of the tool (scenarios, development stages, activities, and qualities/constraints being addressed) and the actual tool technology (tool architecture, inputs, outputs, modeling content, extensibility and ...

  3. Transferring 2001 National Household Travel Survey

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Patricia S [ORNL; Reuscher, Tim [ORNL; Schmoyer, Richard L [ORNL; Chin, Shih-Miao [ORNL

    2007-05-01

    Policy makers rely on transportation statistics, including data on personal travel behavior, to formulate strategic transportation policies, and to improve the safety and efficiency of the U.S. transportation system. Data on personal travel trends are needed to examine the reliability, efficiency, capacity, and flexibility of the Nation's transportation system to meet current demands and to accommodate future demand. These data are also needed to assess the feasibility and efficiency of alternative congestion-mitigating technologies (e.g., high-speed rail, magnetically levitated trains, and intelligent vehicle and highway systems); to evaluate the merits of alternative transportation investment programs; and to assess the energy-use and air-quality impacts of various policies. To address these data needs, the U.S. Department of Transportation (USDOT) initiated an effort in 1969 to collect detailed data on personal travel. The 1969 survey was the first Nationwide Personal Transportation Survey (NPTS). The survey was conducted again in 1977, 1983, 1990, 1995, and 2001. Data on daily travel were collected in 1969, 1977, 1983, 1990 and 1995. In 2001, the survey was renamed the National Household Travel Survey (NHTS) and it collected both daily and long-distance trips. The 2001 survey was sponsored by three USDOT agencies: Federal Highway Administration (FHWA), Bureau of Transportation Statistics (BTS), and National Highway Traffic Safety Administration (NHTSA). The primary objective of the survey was to collect trip-based data on the nature and characteristics of personal travel so that the relationships between the characteristics of personal travel and the demographics of the traveler can be established. Commercial and institutional travel were not part of the survey. Due to the survey's design, data in the NHTS survey series were not recommended for estimating travel statistics for categories smaller than the combination of Census division (e.g., New

  4. Publication selection and the income elasticity of the value of a statistical life.

    Science.gov (United States)

    Doucouliagos, Hristos; Stanley, T D; Viscusi, W Kip

    2014-01-01

    Estimates of the value of a statistical life (VSL) establish the price government agencies use to value fatality risks. Transferring these valuations to other populations often utilizes the income elasticity of the VSL, estimates of which typically come from meta-analyses. Using a data set consisting of 101 estimates of the income elasticity of the VSL from 14 previously reported meta-analyses, we find that after accounting for potential publication bias the VSL is clearly and robustly income-inelastic, with an elasticity of approximately 0.25-0.63. There is also clear evidence of the importance of controlling for levels of risk, of differential publication selection bias, and of the greater income sensitivity of VSLs from stated preference surveys. Copyright © 2013 Elsevier B.V. All rights reserved.
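
    Publication-bias corrections in this literature are often funnel-asymmetry / precision-effect (FAT-PET) meta-regressions; a hedged sketch with synthetic data (not the paper's dataset): reported estimates are regressed on their standard errors, and the intercept serves as the selection-corrected elasticity.

```python
import numpy as np

rng = np.random.default_rng(9)
se = rng.uniform(0.05, 0.5, 101)                  # standard errors of 101 estimates
elasticity = 0.4 + 1.2 * se + rng.normal(0, se)   # selection inflates imprecise estimates

X = np.column_stack([np.ones_like(se), se])
W = np.diag(1.0 / se**2)                          # inverse-variance weights
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ elasticity)
print(f"corrected elasticity (intercept): {beta[0]:.2f}, FAT slope: {beta[1]:.2f}")
```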

  5. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    Science.gov (United States)

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
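
    For reference, the standard one-sided CUSUM recursion from statistical process control that the paper adapts (this generic form is not the paper's person-fit statistic): deviations from a target beyond an allowance k accumulate, and an alarm is raised when either sum exceeds a threshold h.

```python
import numpy as np

def cusum(x, target, k=0.5, h=4.0):
    """Upper/lower CUSUM paths and the index of the first alarm (or None)."""
    c_plus, c_minus, first_alarm = 0.0, 0.0, None
    ups, downs = [], []
    for i, xi in enumerate(x):
        c_plus = max(0.0, c_plus + (xi - target) - k)
        c_minus = max(0.0, c_minus - (xi - target) - k)
        ups.append(c_plus)
        downs.append(c_minus)
        if first_alarm is None and (c_plus > h or c_minus > h):
            first_alarm = i
    return np.array(ups), np.array(downs), first_alarm

rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(0, 1, 40), rng.normal(1.5, 1, 20)])  # shift at t = 40
_, _, alarm = cusum(data, target=0.0)
print("first alarm at index:", alarm)
```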

  6. The importance of the supportive control environment for internal audit effectiveness – the case of Croatian companies

    OpenAIRE

    Barišić, Ivana; Tušek, Boris

    2016-01-01

    The paper investigates whether a supportive control environment is associated with the internal audit effectiveness and what characteristics of a control environment are important in this respect. A survey was conducted via a questionnaire on 54 mostly large companies in Croatia. Appropriate methods of statistical analysis were used in order to analyse the survey results. According to the research results, in the case of a supportive control environment there is a greater ch...

  7. Results of a multicentre randomised controlled trial of statistical process control charts and structured diagnostic tools to reduce ward-acquired meticillin-resistant Staphylococcus aureus: the CHART Project.

    Science.gov (United States)

    Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R

    2008-10-01

    Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers on ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (P < …), including the control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.

  8. On the limitations of statistical absorption studies with the Sloan Digital Sky Surveys I-III

    Science.gov (United States)

    Lan, Ting-Wen; Ménard, Brice; Baron, Dalya; Johnson, Sean; Poznanski, Dovi; Prochaska, J. Xavier; O'Meara, John M.

    2018-04-01

    We investigate the limitations of statistical absorption measurements with the SDSS optical spectroscopic surveys. We show that changes in the data reduction strategy throughout different data releases have led to a better accuracy at long wavelengths, in particular for sky line subtraction, but a degradation at short wavelengths with the emergence of systematic spectral features with an amplitude of about one percent. We show that these features originate from inaccuracy in the fitting of modeled F-star spectra used for flux calibration. The best-fit models for those stars are found to systematically over-estimate the strength of metal lines and under-estimate that of Lithium. We also identify the existence of artifacts due to masking and interpolation procedures at the wavelengths of the hydrogen Balmer series leading to the existence of artificial Balmer α absorption in all SDSS optical spectra. All these effects occur in the rest-frame of the standard stars and therefore present Galactic longitude variations due to the rotation of the Galaxy. We demonstrate that the detection of certain weak absorption lines reported in the literature are solely due to calibration effects. Finally, we discuss new strategies to mitigate these issues.

  9. Statistics on Science and Technology in Latin America, Experience with UNESCO Pilot Projects, 1972-1974.

    Science.gov (United States)

    Thebaud, Schiller

    This report examines four UNESCO pilot projects undertaken in 1972 in Brazil, Colombia, Peru, and Uruguay to study the methods used for national statistical surveys of science and technology. The projects specifically addressed the problems of comparing statistics gathered by different methods in different countries. Surveys carried out in Latin…

  10. Optimage central organised image quality control including statistics and reporting

    International Nuclear Information System (INIS)

    Jahnen, A.; Schilz, C.; Shannoun, F.; Schreiner, A.; Hermen, J.; Moll, C.

    2008-01-01

    Quality control of medical imaging systems is performed using dedicated phantoms. As imaging systems become more and more digital, adequate image processing methods can help to save evaluation time and to obtain objective results. The software package OPTIMAGE was developed with this focus and takes a centralised approach: on one hand, OPTIMAGE provides a framework which includes functions like database integration, DICOM data sources, a multilingual user interface and image processing functionality. On the other hand, the test methods are implemented using modules which are able to process the images automatically for the common imaging systems. The integration of statistics and reporting into this environment is paramount: this is the only way to provide these functions in an interactive, user-friendly way. These features enable users to discover degradation in performance quickly and to document performed measurements easily. (authors)

  11. 1993 commodity flow survey : state summaries

    Science.gov (United States)

    1997-06-01

    This report summarizes the Commodity Flow Survey (CFS) state reports released between February 1996 and July 1996 by the Bureau of the Census and the 1993 Commodity Flow Survey: Preliminary Observations by the Bureau of Transportation Statistics. Inf...

  12. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  13. Towards consistent and reliable Dutch and international energy statistics for the chemical industry

    International Nuclear Information System (INIS)

    Neelis, M.L.; Pouwelse, J.W.

    2008-01-01

    Consistent and reliable energy statistics are of vital importance for proper monitoring of energy-efficiency policies. In recent studies, irregularities have been reported in the Dutch energy statistics for the chemical industry. We studied in depth the company data that form the basis of the energy statistics in the Netherlands between 1995 and 2004 to find causes for these irregularities. We discovered that chemical products have occasionally been included, resulting in statistics with an inconsistent system boundary. Lack of guidance in the survey on the complex energy conversions in the chemical industry also resulted in large fluctuations for certain energy commodities. The findings of our analysis have been the basis for a new survey that has been used since 2007. We demonstrate that the annual questionnaire used for the international energy statistics can result in problems comparable to those observed in the Netherlands. We suggest including chemical residual gas as an energy commodity in the questionnaire and including the energy conversions in the chemical industry in the international energy statistics. In addition, we think the questionnaire should be explicit about the treatment of basic chemical products produced at refineries and in the petrochemical industry to avoid system boundary problems.

  14. Statistical quality management using miniTAB 14

    International Nuclear Information System (INIS)

    An, Seong Jin

    2007-01-01

    This book explains statistical quality management, covering the definition of quality, quality management, quality cost, basic methods of quality management, principles of control charts, control charts for variables, control charts for attributes, capability analysis, other issues of statistical process control, acceptance sampling, sampling for variables acceptance, design and analysis of experiments, Taguchi quality engineering, response surface methodology, and reliability analysis.

  15. Development, application, and validation of a survey for infectious disease control practices at equine boarding facilities.

    Science.gov (United States)

    Kirby, Alanna T; Traub-Dargatz, Josie L; Hill, Ashley E; Kogan, Lori R; Morley, Paul S; Heird, James C

    2010-11-15

    To develop a questionnaire for self-assessment of biosecurity practices at equine boarding facilities and to evaluate infectious disease control practices in these facilities in Colorado. Cross-sectional study. 64 equine boarding facilities in Colorado. Survey questions were rated according to importance for prevention and containment of equine infectious diseases. Point values (range, 0 to 20) were assigned for possible responses, with greater values given for optimal infection control methods. Questionnaires were mailed to equine boarding facilities in Colorado advertised on the World Wide Web. Survey responses were compared with assessments made by a member of the research team during visits to 30 randomly selected facilities. Agreement among results was analyzed via a kappa test and rated as poor, fair, moderate, substantial, or nearly perfect. Survey responses were received for 64 of 163 (39%) equine boarding facilities. Scores ranged from 106 to 402 points (maximum possible score, 418). Most facilities received better scores for movement and housing of equids than for other sections of the survey. Respondents at 24 of 48 (50%) facilities that routinely received new equids reported isolation of new arrivals. Agreement between self-assessment by survey respondents and evaluation by a member of the research team was determined to be fair to substantial. Most equine boarding facilities have opportunities to improve measures for prevention or containment of contagious diseases (eg, isolation of newly arrived equids and use of written health management protocols). Most self-assessments of infection control practices were accurate.

  16. Statistical estimates of absenteeism attributable to seasonal and pandemic influenza from the Canadian Labour Force Survey.

    Science.gov (United States)

    Schanzer, Dena L; Zheng, Hui; Gilmore, Jason

    2011-04-12

    As many respiratory viruses are responsible for influenza-like symptoms, accurate measures of the disease burden are not available and estimates are generally based on statistical methods. The objective of this study was to estimate absenteeism rates and hours lost due to seasonal influenza and compare these estimates with estimates of absenteeism attributable to the two H1N1 pandemic waves that occurred in 2009. Key absenteeism variables were extracted from Statistics Canada's monthly labour force survey (LFS). Absenteeism and the proportion of hours lost due to own illness or disability were modelled as a function of trend, seasonality and proxy variables for influenza activity from 1998 to 2009. Hours lost due to the H1N1/09 pandemic strain were elevated compared to seasonal influenza, accounting for a loss of 0.2% of potential hours worked annually. In comparison, an estimated 0.08% of hours worked annually were lost due to seasonal influenza illnesses. Absenteeism rates due to influenza were estimated at 12% per year for seasonal influenza over the 1997/98 to 2008/09 seasons, and 13% for the two H1N1/09 pandemic waves. Employees who took time off due to a seasonal influenza infection took an average of 14 hours off. For the pandemic strain, the average absence was 25 hours. This study confirms that absenteeism due to seasonal influenza has typically ranged from 5% to 20%, with higher rates associated with multiple circulating strains. Absenteeism rates for the 2009 pandemic were similar to those occurring for seasonal influenza. Employees took more time off due to the pandemic strain than was typical for seasonal influenza.
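
    The modelling approach described here, absenteeism expressed as a function of trend, seasonality and an influenza-activity proxy, is a standard regression decomposition. A minimal sketch with statsmodels, using simulated monthly data in place of the confidential LFS microdata; all variable names and magnitudes are illustrative:

```python
# Sketch: model monthly absenteeism as trend + seasonality + influenza proxy,
# in the spirit of the LFS analysis. All data here are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
months = pd.period_range("1998-01", "2009-12", freq="M")
n = len(months)

df = pd.DataFrame({
    "trend": np.arange(n),                      # linear time trend
    "month": months.month.astype(str),          # seasonal dummies via C(month)
    # proxy for influenza activity, e.g. lab-confirmed detections (simulated,
    # concentrated in December-March)
    "flu_proxy": rng.gamma(2.0, 1.0, n) * ((months.month <= 3) | (months.month == 12)),
})
df["pct_hours_lost"] = (2.0 + 0.002 * df["trend"]
                        + 0.05 * df["flu_proxy"] + rng.normal(0, 0.1, n))

model = smf.ols("pct_hours_lost ~ trend + C(month) + flu_proxy", data=df).fit()
# Influenza-attributable hours are the proxy coefficient times proxy activity.
print(model.params["flu_proxy"], model.rsquared)
```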

  17. Statistical estimates of absenteeism attributable to seasonal and pandemic influenza from the Canadian Labour Force Survey

    Science.gov (United States)

    2011-01-01

    Background As many respiratory viruses are responsible for influenza-like symptoms, accurate measures of the disease burden are not available and estimates are generally based on statistical methods. The objective of this study was to estimate absenteeism rates and hours lost due to seasonal influenza and compare these estimates with estimates of absenteeism attributable to the two H1N1 pandemic waves that occurred in 2009. Methods Key absenteeism variables were extracted from Statistics Canada's monthly labour force survey (LFS). Absenteeism and the proportion of hours lost due to own illness or disability were modelled as a function of trend, seasonality and proxy variables for influenza activity from 1998 to 2009. Results Hours lost due to the H1N1/09 pandemic strain were elevated compared to seasonal influenza, accounting for a loss of 0.2% of potential hours worked annually. In comparison, an estimated 0.08% of hours worked annually were lost due to seasonal influenza illnesses. Absenteeism rates due to influenza were estimated at 12% per year for seasonal influenza over the 1997/98 to 2008/09 seasons, and 13% for the two H1N1/09 pandemic waves. Employees who took time off due to a seasonal influenza infection took an average of 14 hours off. For the pandemic strain, the average absence was 25 hours. Conclusions This study confirms that absenteeism due to seasonal influenza has typically ranged from 5% to 20%, with higher rates associated with multiple circulating strains. Absenteeism rates for the 2009 pandemic were similar to those occurring for seasonal influenza. Employees took more time off due to the pandemic strain than was typical for seasonal influenza. PMID:21486453

  18. Workers' Participation and the Distribution of Control as Perceived by Members of Ten German Companies.

    Science.gov (United States)

    Bartolke, Klaus; And Others

    1982-01-01

    A survey of 601 managers and workers in 10 German manufacturing companies studied the implications of workers' participation for the exercise of control. Statistical analysis of data on control over work environments, production organization, personnel, and finance indicated that, in more participative companies, distribution of control is more…

  19. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. These phenomena include pollution transport prediction from continuous area sources or one-time releases of toxic materials, and wind energy prospecting in areas of topographic enhancement of the wind. Quality control techniques that can be applied to these data to determine whether the instruments are operating within their prescribed tolerances were investigated. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability and power spectra. The comparative techniques include covariance, cross-spectral analysis and two-dimensional probability density. At present, the calculating and plotting routines for these statistical techniques do not reside in a single code, so it is difficult to ascribe independent memory size and computation time accurately. However, given the flexibility of a data system which includes simple and fast-running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF), a proper balance will be attained. These techniques are described in detail and preliminary results are presented
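
    The independent and comparative statistics listed in this abstract map directly onto standard numerical routines. A minimal sketch for two neighbouring anemometer records, assuming synthetic 1-Hz wind-speed samples, that computes single-instrument moments and a power spectrum plus a pairwise covariance and coherence for cross-checking:

```python
# Sketch: independent and comparative QC statistics for two anemometer records.
# The 1-Hz wind-speed series below are synthetic.
import numpy as np
from scipy import stats, signal

rng = np.random.default_rng(1)
fs = 1.0                                   # sampling frequency, Hz
common = rng.normal(0, 1, 4096)            # shared mesoscale signal
u1 = 5.0 + common + 0.3 * rng.normal(0, 1, 4096)
u2 = 5.1 + common + 0.3 * rng.normal(0, 1, 4096)

# Independent statistics for one instrument.
print("mean", u1.mean(), "std", u1.std(ddof=1))
print("skewness", stats.skew(u1), "kurtosis", stats.kurtosis(u1))
freqs, psd = signal.welch(u1, fs=fs)       # power spectrum (Welch estimate)
print("spectral peak at", freqs[np.argmax(psd)], "Hz")

# Comparative statistics between neighbouring instruments.
cov = np.cov(u1, u2)[0, 1]
freqs_c, coh = signal.coherence(u1, u2, fs=fs)
print("covariance", cov, "median coherence", np.median(coh))
# An instrument whose moments or coherence drift outside tolerance is flagged.
```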

  20. A study of statistics anxiety levels of graduate dental hygiene students.

    Science.gov (United States)

    Welch, Paul S; Jacks, Mary E; Smiley, Lynn A; Walden, Carolyn E; Clark, William D; Nguyen, Carol A

    2015-02-01

    In light of increased emphasis on evidence-based practice in the profession of dental hygiene, it is important that today's dental hygienist comprehend statistical measures to fully understand research articles, and thereby apply scientific evidence to practice. Therefore, the purpose of this study was to investigate statistics anxiety among graduate dental hygiene students in the U.S. A web-based, self-report, anonymous survey was emailed to directors of 17 MSDH programs in the U.S. with a request to distribute it to graduate students. The survey collected data on statistics anxiety, sociodemographic characteristics and evidence-based practice. Statistics anxiety was assessed using the Statistical Anxiety Rating Scale. The study significance level was α=0.05. Only 8 of the 17 invited programs participated in the study. Statistical Anxiety Rating Scale data revealed that graduate dental hygiene students experience low to moderate levels of statistics anxiety. Specifically, the level of anxiety on the Interpretation Anxiety factor indicated this population could struggle with making sense of scientific research. A decisive majority (92%) of students indicated statistics is essential for evidence-based practice and should be a required course for all dental hygienists. This study served to identify statistics anxiety in a previously unexplored population. The findings should be useful in both theory building and in practical applications. Furthermore, the results can be used to direct future research. Copyright © 2015 The American Dental Hygienists’ Association.

  1. Radiological Control Technician: Phase 1, Site academic training study guides

    International Nuclear Information System (INIS)

    1992-10-01

    This volume is a study guide for training Radiological Control Technicians. Provided herein are support materials for learning radiological documentation, communication systems, counting errors and statistics, dosimetry, contamination control, airborne sampling program methods, respiratory protection, radiological source control, environmental monitoring, access control and work area setup, radiological work coverage, shipment and receipt for radioactive material, radiological incidents and emergencies, personnel decontamination, first aid, radiation survey instrumentation, contamination monitoring, air sampling, and counting room equipment

  2. The Association of Academic Health Sciences Libraries Annual Statistics: an exploratory twenty-five-year trend analysis.

    Science.gov (United States)

    Byrd, Gary D; Shedlock, James

    2003-04-01

    This paper presents an exploratory trend analysis of the statistics published over the past twenty-four editions of the Annual Statistics of Medical School Libraries in the United States and Canada. The analysis focuses on the small subset of nineteen consistently collected data variables (out of 656 variables collected during the history of the survey) to provide a general picture of the growth and changing dimensions of services and resources provided by academic health sciences libraries over those two and one-half decades. The paper also analyzes survey response patterns for U.S. and Canadian medical school libraries, as well as osteopathic medical school libraries surveyed since 1987. The trends show steady, but not dramatic, increases in annual means for total volumes collected, expenditures for staff, collections and other operating costs, personnel numbers and salaries, interlibrary lending and borrowing, reference questions, and service hours. However, when controlled for inflation, most categories of expenditure have just managed to stay level. The exceptions have been expenditures for staff development and travel and for collections, which have both outpaced inflation. The fill rate for interlibrary lending requests has remained steady at about 75%, but the mean ratio of items lent to items borrowed has decreased by nearly 50%.

  3. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish control limits for the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. The IMRT plans used nine 6-MV fields, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value was 1.60 for IMRT QA and 1.99 for VMAT QA, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because both Cpml values are higher than 1.0.
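
    The control limits and capability index used in this study can be sketched from a run of % gamma-pass values. A minimal illustration on synthetic data, taking the limits from the first 50 points as in the abstract; the lower specification limit and target used for the one-sided, Taguchi-style Cpml are assumptions, and the study's exact Cpml convention may differ:

```python
# Sketch: control limits and Cpml for a series of % gamma-pass QA results.
# Data are synthetic; LSL and target below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(2)
gamma_pass = np.clip(rng.normal(93.7, 3.7, 278), 0, 100)  # synthetic IMRT-like run

# Control limits from the first 50 points (as in the study).
baseline = gamma_pass[:50]
mu, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma
print(f"centre {mu:.1f}, LCL {lcl:.1f}, UCL {ucl:.1f}")

# One-sided (lower) Taguchi-style capability index:
#   Cpml = (mu - LSL) / (3 * sqrt(sigma^2 + (mu - T)^2))
LSL, T = 90.0, 100.0            # hypothetical lower spec limit and target
tau = np.sqrt(sigma**2 + (mu - T) ** 2)
cpml = (mu - LSL) / (3 * tau)
print(f"Cpml = {cpml:.2f}")
```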

  4. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  5. Cosmology with weak lensing surveys

    International Nuclear Information System (INIS)

    Munshi, Dipak; Valageas, Patrick; Waerbeke, Ludovic van; Heavens, Alan

    2008-01-01

    Weak gravitational lensing is responsible for the shearing and magnification of the images of high-redshift sources due to the presence of intervening matter. The distortions are due to fluctuations in the gravitational potential, and are directly related to the distribution of matter and to the geometry and dynamics of the Universe. As a consequence, weak gravitational lensing offers unique possibilities for probing the Dark Matter and Dark Energy in the Universe. In this review, we summarise the theoretical and observational state of the subject, focussing on the statistical aspects of weak lensing, and consider the prospects for weak lensing surveys in the future. Weak gravitational lensing surveys are complementary to both galaxy surveys and cosmic microwave background (CMB) observations as they probe the unbiased non-linear matter power spectrum at modest redshifts. Most of the cosmological parameters are accurately estimated from CMB and large-scale galaxy surveys, so the focus of attention is shifting to understanding the nature of Dark Matter and Dark Energy. On the theoretical side, recent advances in the use of 3D information of the sources from photometric redshifts promise greater statistical power, and these are further enhanced by the use of statistics beyond two-point quantities such as the power spectrum. The use of 3D information also alleviates difficulties arising from physical effects such as the intrinsic alignment of galaxies, which can mimic weak lensing to some extent. On the observational side, in the next few years weak lensing surveys such as CFHTLS, VST-KIDS and Pan-STARRS, and the planned Dark Energy Survey, will provide the first weak lensing surveys covering very large sky areas and depth. In the long run even more ambitious programmes such as DUNE, the Supernova Anisotropy Probe (SNAP) and Large-aperture Synoptic Survey Telescope (LSST) are planned. Weak lensing of diffuse components such as the CMB and 21 cm emission can also

  6. Cosmology with weak lensing surveys

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak [Institute of Astronomy, Madingley Road, Cambridge, CB3 OHA (United Kingdom); Astrophysics Group, Cavendish Laboratory, Madingley Road, Cambridge CB3 OHE (United Kingdom)], E-mail: munshi@ast.cam.ac.uk; Valageas, Patrick [Service de Physique Theorique, CEA Saclay, 91191 Gif-sur-Yvette (France); Waerbeke, Ludovic van [University of British Columbia, Department of Physics and Astronomy, 6224 Agricultural Road, Vancouver, BC V6T 1Z1 (Canada); Heavens, Alan [SUPA - Scottish Universities Physics Alliance, Institute for Astronomy, University of Edinburgh, Blackford Hill, Edinburgh EH9 3HJ (United Kingdom)

    2008-06-15

    Weak gravitational lensing is responsible for the shearing and magnification of the images of high-redshift sources due to the presence of intervening matter. The distortions are due to fluctuations in the gravitational potential, and are directly related to the distribution of matter and to the geometry and dynamics of the Universe. As a consequence, weak gravitational lensing offers unique possibilities for probing the Dark Matter and Dark Energy in the Universe. In this review, we summarise the theoretical and observational state of the subject, focussing on the statistical aspects of weak lensing, and consider the prospects for weak lensing surveys in the future. Weak gravitational lensing surveys are complementary to both galaxy surveys and cosmic microwave background (CMB) observations as they probe the unbiased non-linear matter power spectrum at modest redshifts. Most of the cosmological parameters are accurately estimated from CMB and large-scale galaxy surveys, so the focus of attention is shifting to understanding the nature of Dark Matter and Dark Energy. On the theoretical side, recent advances in the use of 3D information of the sources from photometric redshifts promise greater statistical power, and these are further enhanced by the use of statistics beyond two-point quantities such as the power spectrum. The use of 3D information also alleviates difficulties arising from physical effects such as the intrinsic alignment of galaxies, which can mimic weak lensing to some extent. On the observational side, in the next few years weak lensing surveys such as CFHTLS, VST-KIDS and Pan-STARRS, and the planned Dark Energy Survey, will provide the first weak lensing surveys covering very large sky areas and depth. In the long run even more ambitious programmes such as DUNE, the Supernova Anisotropy Probe (SNAP) and Large-aperture Synoptic Survey Telescope (LSST) are planned. Weak lensing of diffuse components such as the CMB and 21 cm emission can also

  7. Statistical process control for radiotherapy quality assurance

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.

    2005-01-01

    Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a small number of large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques are applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process, whereas with a standard deviation calculation no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process
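
    The contrast drawn in this abstract between process-behavior limits and thresholds set from the global standard deviation is easy to demonstrate. A minimal sketch of an individuals/moving-range (XmR) chart on synthetic daily-output data, in which an injected 2% shift is caught by the XmR limits but not by naive 3-sigma limits; the 2.66 scaling of the average moving range is the standard XmR constant:

```python
# Sketch: XmR (individuals / moving range) process behaviour chart for daily
# output QA, contrasted with naive +/- 3 * standard-deviation limits.
# The daily output measurements are synthetic.
import numpy as np

rng = np.random.default_rng(3)
output = 100 + rng.normal(0, 0.5, 52)      # % of nominal output over 52 days
output[35:] += 2.0                         # inject a 2% systematic shift

centre = output.mean()
m_r = np.abs(np.diff(output))              # moving ranges between successive days
mr_bar = m_r.mean()

# XmR natural process limits: centre +/- 2.66 * average moving range.
unpl, lnpl = centre + 2.66 * mr_bar, centre - 2.66 * mr_bar

# Naive limits from the global standard deviation are inflated by the shift
# itself, so the shift can go undetected.
sd_ucl, sd_lcl = centre + 3 * output.std(ddof=1), centre - 3 * output.std(ddof=1)

print("XmR flags days:", np.flatnonzero((output > unpl) | (output < lnpl)))
print("3-sigma flags days:", np.flatnonzero((output > sd_ucl) | (output < sd_lcl)))
```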

  8. DEVELOPMENT AND APPLICATION OF THE KEY TECHNOLOGIES FOR THE QUALITY CONTROL AND INSPECTION OF NATIONAL GEOGRAPHICAL CONDITIONS SURVEY PRODUCTS

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2018-04-01

    The First National Geographical Condition Survey is a foundational task for dynamically tracking the basic state of nature, ecology and human activities on the earth’s surface, and it is a brand-new kind of mapping and geographic information engineering. In order to ensure comprehensive, authentic and accurate survey results and to achieve the quality management targets of a 100 % qualified rate and a yield of more than 80 %, it is necessary to carry out quality control and result inspection for the national geographical conditions survey on a national scale. To ensure that achievement quality meets these targets, this paper develops a “five-in-one” quality control method consisting of a quality control system for the national geographical condition survey, a quality inspection technology system, a quality evaluation system, a quality inspection information management system, and nationally linked quality control institutions. The method is designed for the survey’s large scale, wide coverage, many undertaking units and management levels, evolving technology, numerous production processes and marked regional differences, and takes into account the novel forms of the results, their complicated dependencies, the many special reference datasets and the large data volumes. Drawing fully on domestic and foreign research results and production experience, and considering both technological development and production needs, the project stipulates the inspection methods and technical requirements for each stage of the quality inspection of the geographical condition survey results, extends traditional inspection and acceptance techniques, and solves key technologies urgently needed in the first national geographic survey.

  9. Development and Application of the Key Technologies for the Quality Control and Inspection of National Geographical Conditions Survey Products

    Science.gov (United States)

    Zhao, Y.; Zhang, L.; Ma, W.; Zhang, P.; Zhao, T.

    2018-04-01

    The First National Geographical Condition Survey is a foundational task for dynamically tracking the basic state of nature, ecology and human activities on the earth's surface, and it is a brand-new kind of mapping and geographic information engineering. In order to ensure comprehensive, authentic and accurate survey results and to achieve the quality management targets of a 100 % qualified rate and a yield of more than 80 %, it is necessary to carry out quality control and result inspection for the national geographical conditions survey on a national scale. To ensure that achievement quality meets these targets, this paper develops a "five-in-one" quality control method consisting of a quality control system for the national geographical condition survey, a quality inspection technology system, a quality evaluation system, a quality inspection information management system, and nationally linked quality control institutions. The method is designed for the survey's large scale, wide coverage, many undertaking units and management levels, evolving technology, numerous production processes and marked regional differences, and takes into account the novel forms of the results, their complicated dependencies, the many special reference datasets and the large data volumes. Drawing fully on domestic and foreign research results and production experience, and considering both technological development and production needs, the project stipulates the inspection methods and technical requirements for each stage of the quality inspection of the geographical condition survey results, extends traditional inspection and acceptance techniques, and solves key technologies urgently needed in the first national geographic survey.

  10. Methodology for performing measurements to release material from radiological control

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1993-09-01

    This report describes the existing and proposed methodologies for performing measurements of contamination prior to releasing material for uncontrolled use at the Hanford Site. The technical basis for the proposed methodology, a modification to the existing contamination survey protocol, is also described. The modified methodology, which includes a large-area swipe followed by a statistical survey, can be used to survey material that is unlikely to be contaminated for release to controlled and uncontrolled areas. The material evaluation procedure that is used to determine the likelihood of contamination is also described

  11. Using Fun in the Statistics Classroom: An Exploratory Study of College Instructors' Hesitations and Motivations

    Science.gov (United States)

    Lesser, Lawrence M.; Wall, Amitra A.; Carver, Robert H.; Pearl, Dennis K.; Martin, Nadia; Kuiper, Shonda; Posner, Michael A.; Erickson, Patricia; Liao, Shu-Min; Albert, Jim; Weber, John J., III

    2013-01-01

    This study examines statistics instructors' use of fun as well as their motivations, hesitations, and awareness of resources. In 2011, a survey was administered to attendees at a national statistics education conference, and follow-up qualitative interviews were conducted with 16 of those ("N" = 249) surveyed to provide further…

  12. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included, e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry.

  13. Pre-Statistical Process Control: Making Numbers Count! JobLink Winning at Work Instructor's Manual, Module 3.

    Science.gov (United States)

    Coast Community Coll. District, Costa Mesa, CA.

    This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…

  14. Compilation of streamflow statistics calculated from daily mean streamflow data collected during water years 1901–2015 for selected U.S. Geological Survey streamgages

    Science.gov (United States)

    Granato, Gregory E.; Ries, Kernell G.; Steeves, Peter A.

    2017-10-16

    Streamflow statistics are needed by decision makers for many planning, management, and design activities. The U.S. Geological Survey (USGS) StreamStats Web application provides convenient access to streamflow statistics for many streamgages by accessing the underlying StreamStatsDB database. In 2016, non-interpretive streamflow statistics were compiled for streamgages located throughout the Nation and stored in StreamStatsDB for use with StreamStats and other applications. Two previously published USGS computer programs that were designed to help calculate streamflow statistics were updated to better support StreamStats as part of this effort. These programs are named “GNWISQ” (Get National Water Information System Streamflow (Q) files), updated to version 1.1.1, and “QSTATS” (Streamflow (Q) Statistics), updated to version 1.1.2. Statistics for 20,438 streamgages that had 1 or more complete years of record during water years 1901 through 2015 were calculated from daily mean streamflow data; 19,415 of these streamgages were within the conterminous United States. About 89 percent of the 20,438 streamgages had 3 or more years of record, and about 65 percent had 10 or more years of record. Drainage areas of the 20,438 streamgages ranged from 0.01 to 1,144,500 square miles. The magnitude of annual average streamflow yields (streamflow per square mile) for these streamgages varied by almost six orders of magnitude, from 0.000029 to 34 cubic feet per second per square mile. About 64 percent of these streamgages did not have any zero-flow days during their available period of record. The 18,122 streamgages with 3 or more years of record were included in the StreamStatsDB compilation so they would be available via the StreamStats interface for user-selected streamgages. All the statistics are available in a USGS ScienceBase data release.
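
    The non-interpretive statistics compiled in this release (annual mean streamflow, yield per square mile, zero-flow days) follow mechanically from a daily-mean record. A minimal pandas sketch on a synthetic daily series standing in for an NWIS download; the drainage area is illustrative:

```python
# Sketch: non-interpretive streamflow statistics from a daily-mean record,
# in the spirit of QSTATS. The daily series and drainage area are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
days = pd.date_range("2000-10-01", "2015-09-30", freq="D")  # water years 2001-2015
flow = pd.Series(np.maximum(rng.gamma(2.0, 40.0, len(days)) - 15, 0), index=days)

drainage_area_sqmi = 102.0           # illustrative drainage area

# Water year n runs from Oct 1 of year n-1 through Sep 30 of year n.
water_year = days.year + (days.month >= 10)

annual_mean = flow.groupby(water_year).mean()        # cfs, by water year
period_mean = annual_mean.mean()
yield_cfsm = period_mean / drainage_area_sqmi        # cfs per square mile
zero_days = int((flow == 0).sum())

print(f"mean flow {period_mean:.1f} cfs, yield {yield_cfsm:.3f} cfs/mi^2, "
      f"{zero_days} zero-flow days")
```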

  15. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    International Nuclear Information System (INIS)

    Carver, A; Rowbottom, C

    2016-01-01

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate MPC as a tool for statistical process control (SPC). Methods: Varian’s MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might naively be expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements, before they become out of specification. A. Carver has received a speaker’s honorarium from Varian

  16. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB, covering the arrangement and summary of data, probability, probability distributions, sampling distributions, estimation, hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (including the concept of control charts, consecutive control charts, the breakthrough strategy, and analysis using MATLAB), reliability analysis (such as the measurement of reliability and analysis with MATLAB), and Markov chains.

  17. Statistical power and the Rorschach: 1975-1991.

    Science.gov (United States)

    Acklin, M W; McDowell, C J; Orndoff, S

    1992-10-01

    The Rorschach Inkblot Test has been the source of long-standing controversies as to its nature and its psychometric properties. Consistent with behavioral science research in general, the concept of statistical power has been entirely ignored by Rorschach researchers. The concept of power is introduced and discussed, and a power survey of the Rorschach literature published between 1975 and 1991 in the Journal of Personality Assessment, Journal of Consulting and Clinical Psychology, Journal of Abnormal Psychology, Journal of Clinical Psychology, Journal of Personality, Psychological Bulletin, American Journal of Psychiatry, and Journal of Personality and Social Psychology was undertaken. Power was calculated for 2,300 statistical tests in 158 journal articles. Power to detect small, medium, and large effect sizes was .13, .56, and .85, respectively. Similar to the findings in other power surveys conducted on behavioral science research, we concluded that Rorschach research is underpowered to detect the differences under investigation. This undoubtedly contributes to the inconsistency of research findings which has been a source of controversy and criticism over the decades. It appears that research conducted according to the Comprehensive System for the Rorschach is more powerful. Recommendations are offered for improving power and strengthening the design sensitivity of Rorschach research, including increasing sample sizes, use of parametric statistics, reduction of error variance, more accurate reporting of findings, and editorial policies reflecting concern about the magnitude of relationships beyond an exclusive focus on levels of statistical significance.
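
    The power figures quoted (.13, .56 and .85 for small, medium and large effects) follow from Cohen's effect-size conventions once a typical sample size is assumed. A minimal sketch with statsmodels; the per-group n of 30 is an illustrative assumption, not a figure from the review:

```python
# Sketch: power of a two-sample t-test at Cohen's small/medium/large effect
# sizes. The per-group n of 30 is illustrative, not taken from the review.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    power = analysis.power(effect_size=d, nobs1=30, alpha=0.05,
                           ratio=1.0, alternative="two-sided")
    print(f"{label:6s} (d = {d}): power = {power:.2f}")

# Solving for the per-group n needed to reach power = .80 at a medium effect:
n_needed = analysis.solve_power(effect_size=0.5, power=0.80, alpha=0.05)
print(f"n per group for 80% power at d = 0.5: {n_needed:.0f}")
```

    Run with these assumptions, the three powers come out close to the survey's reported averages, which is consistent with the modest sample sizes typical of that literature.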

  18. An Exploration of Student Attitudes and Satisfaction in a GAISE-Influenced Introductory Statistics Course

    Science.gov (United States)

    Paul, Warren; Cunnington, R. Clare

    2017-01-01

    We used the Survey of Attitudes Toward Statistics to (1) evaluate using presemester data the Students' Attitudes Toward Statistics Model (SATS-M), and (2) test the effect on attitudes of an introductory statistics course redesigned according to the Guidelines for Assessment and Instruction in Statistics Education (GAISE) by examining the change in…

  19. A survey on the VXIbus and validity analyses for instrumentation and control in NPPs

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Park, Won Man

    1997-06-01

    This document presents the technical status of the VXIbus system and its interfaces. VMEbus, while developed as a backplane for Motorola processors, can be used for data acquisition, control and other instrumentation applications. The VXIbus and its associated standard for form, fit, and electrical interface have simplified the process of putting together automated instrumentation systems. The VXIplug&play Systems Alliance was founded in 1993; the alliance's charter is to improve the effectiveness of VXI-based solutions by increasing ease-of-use and improving the interoperability of mainframes, computers, instruments, and software through open, multivendor standards and practices. This technical report surveys VXI-based instruments for instrumentation and control in NPPs, which are evaluated for expandability, interoperability, maintainability and other features. (author). 10 refs., 4 tabs., 25 figs

  20. Multivariate statistical characterization of groundwater quality in Ain ...

    African Journals Online (AJOL)

    Administrator

    depends much on the sustainability of the available water resources. Water of .... 18 wells currently in use were selected based on the preliminary field survey carried out to ... In recent times, multivariate statistical methods have been applied ...

  1. Influence of Locus Control on Real and Perceived Relationships ...

    African Journals Online (AJOL)

    They included the Nowicki-Strickland Internal–External Locus of Control Scale for children by Nowicki and Strickland (1973) and the Emotional–Social Loneliness Inventory by Vincenzi and Grabosky (1987). A cross-sectional survey design was used, while regression analysis and multivariate statistics were used in data ...

  2. Statistical estimates of absenteeism attributable to seasonal and pandemic influenza from the Canadian Labour Force Survey

    Directory of Open Access Journals (Sweden)

    Zheng Hui

    2011-04-01

    Abstract Background As many respiratory viruses are responsible for influenza-like symptoms, accurate measures of the disease burden are not available and estimates are generally based on statistical methods. The objective of this study was to estimate absenteeism rates and hours lost due to seasonal influenza and compare these estimates with estimates of absenteeism attributable to the two H1N1 pandemic waves that occurred in 2009. Methods Key absenteeism variables were extracted from Statistics Canada's monthly labour force survey (LFS). Absenteeism and the proportion of hours lost due to own illness or disability were modelled as a function of trend, seasonality and proxy variables for influenza activity from 1998 to 2009. Results Hours lost due to the H1N1/09 pandemic strain were elevated compared to seasonal influenza, accounting for a loss of 0.2% of potential hours worked annually. In comparison, an estimated 0.08% of hours worked annually were lost due to seasonal influenza illnesses. Absenteeism rates due to influenza were estimated at 12% per year for seasonal influenza over the 1997/98 to 2008/09 seasons, and 13% for the two H1N1/09 pandemic waves. Employees who took time off due to a seasonal influenza infection took an average of 14 hours off. For the pandemic strain, the average absence was 25 hours. Conclusions This study confirms that absenteeism due to seasonal influenza has typically ranged from 5% to 20%, with higher rates associated with multiple circulating strains. Absenteeism rates for the 2009 pandemic were similar to those occurring for seasonal influenza. Employees took more time off due to the pandemic strain than was typical for seasonal influenza.

  3. Exploring Factors Related to Completion of an Online Undergraduate-Level Introductory Statistics Course

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Johnson, Glenn

    2017-01-01

    Data were collected from 353 online undergraduate introductory statistics students at the beginning of a semester using the Goals and Outcomes Associated with Learning Statistics (GOALS) instrument and an abbreviated form of the Statistics Anxiety Rating Scale (STARS). Data included a survey of expected grade, expected time commitment, and the…

  4. Application of Statistical Increase in Industrial Quality

    International Nuclear Information System (INIS)

    Akhmad-Fauzy

    2000-01-01

    The application of statistical methods in industry is relatively new compared with agriculture and biology. Statistical methods applied in industry focus on industrial system control and are useful for maintaining economical control of the quality of products produced on a large scale. The application of statistical methods in industry has increased rapidly. This fact is supported by the release of the ISO 9000 quality system in 1987 as an international quality standard, which has been adopted by more than 100 countries. (author)

  5. INOPS Survey data report for Sweden

    DEFF Research Database (Denmark)

    Lindholst, Andrej Christian; Severin, Majbritt Christine

    This data report provides statistics on the organization, management and performance of different ways of providing maintenance services within the municipal park and road sector(s) in Sweden. The statistics rely on data collected in the period from May 2015 to June 2015 through an online survey...

  6. Survey of LWR environmental control technology performance and cost

    International Nuclear Information System (INIS)

    Heeb, C.M.; Aaberg, R.L.; Cole, B.M.; Engel, R.L.; Kennedy, W.E. Jr.; Lewallen, M.A.

    1980-03-01

    This study attempts to establish a ranking for species that are routinely released to the environment for a projected nuclear power growth scenario. Unlike comparisons made to existing standards, which are subject to frequent revision, the ranking of releases can be used to form a more logical basis for identifying the areas where further development of control technology could be required. This report describes projections of releases for several fuel cycle scenarios, identifies areas where alternative control technologies may be implemented, and discusses the available alternative control technologies. The release factors were used in a computer code system called ENFORM, which calculates the annual release of any species from any part of the LWR nuclear fuel cycle given a projection of installed nuclear generation capacity. This survey of fuel cycle releases was performed for three reprocessing scenarios (stowaway, reprocessing without recycle of Pu and reprocessing with full recycle of U and Pu) for a 100-year period beginning in 1977. The radioactivity releases were ranked on the basis of a relative ranking factor. The relative ranking factor is based on the 100-year summation of the 50-year population dose commitment from an annual release of radioactive effluents. The nonradioactive releases were ranked on the basis of dilution factor. The twenty highest ranking radioactive releases were identified and each of these was analyzed in terms of the basis for calculating the release and a description of the currently employed control method. Alternative control technology is then discussed, along with the available capital and operating cost figures for alternative control methods

  7. Application of statistical process control to qualitative molecular diagnostic assays.

    Science.gov (United States)

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
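
    The frequency-estimate-plus-confidence-interval scheme described here can be sketched with a standard binomial interval. A minimal illustration in which the expected mutation frequency and the batch counts are hypothetical, and the Wilson interval from statsmodels stands in for whichever interval the authors used; it also shows the sample-size effect noted in the abstract:

```python
# Sketch: flag a batch whose observed mutation frequency is inconsistent with
# the expected frequency, via a binomial confidence interval.
# The expected frequency and batch counts below are hypothetical.
from statsmodels.stats.proportion import proportion_confint

expected_freq = 0.40        # e.g. long-run mutation rate for this assay mix

def check_batch(n_positive: int, n_total: int, alpha: float = 0.05) -> str:
    low, high = proportion_confint(n_positive, n_total, alpha=alpha,
                                   method="wilson")
    in_control = low <= expected_freq <= high
    return (f"{n_positive}/{n_total} positive, 95% CI ({low:.2f}, {high:.2f}): "
            + ("in control" if in_control else "INVESTIGATE"))

print(check_batch(8, 20))   # small batch: wide interval, drift hard to flag
print(check_batch(22, 100)) # larger batch: narrower interval, drift detectable
```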

  8. Application of statistical process control to qualitative molecular diagnostic assays

    LENUS (Irish Health Repository)

    O'Brien, Cathal P.

    2014-11-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.

  9. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education

    Science.gov (United States)

    Christou, Nicolas; Dinov, Ivo D.

    2011-01-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources. PMID:21603097

  10. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education.

    Science.gov (United States)

    Christou, Nicolas; Dinov, Ivo D

    2010-09-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources.

  11. Age and Educational Inequalities in Smoking Cessation Due to Three Population-Level Tobacco Control Interventions: Findings from the International Tobacco Control (ITC) Netherlands Survey

    Science.gov (United States)

    Nagelhout, Gera E.; Crone, Matty R.; van den Putte, Bas; Willemsen, Marc C.; Fong, Geoffrey T.; de Vries, Hein

    2013-01-01

    This study aimed to examine age and educational inequalities in smoking cessation due to the implementation of a tobacco tax increase, smoke-free legislation and a cessation campaign. Longitudinal data from 962 smokers aged 15 years and older were used from three survey waves of the International Tobacco Control (ITC) Netherlands Survey. The 2008…

  12. British Association for the Study of Community Dentistry (BASCD) guidance on the statistical aspects of training and calibration of examiners for surveys of child dental health. A BASCD coordinated dental epidemiology programme quality standard.

    Science.gov (United States)

    Pine, C M; Pitts, N B; Nugent, Z J

    1997-03-01

    The British Association for the Study of Community Dentistry (BASCD) is responsible for the coordination of locally based surveys of child dental health which permit local and national comparisons between health authorities and regions. These surveys began in 1985/86 in England and Wales, 1987/88 in Scotland and 1993/94 in Northern Ireland. BASCD has taken an increasing lead in setting quality standards in discussion with the NHS Epidemiology Coordinators of the Dental Epidemiology Programme. This paper comprises guidance on the statistical aspects of training and calibration of examiners for these surveys.

  13. Statistical fluid mechanics

    CERN Document Server

    Monin, A S

    2007-01-01

    ""If ever a field needed a definitive book, it is the study of turbulence; if ever a book on turbulence could be called definitive, it is this book."" - ScienceWritten by two of Russia's most eminent and productive scientists in turbulence, oceanography, and atmospheric physics, this two-volume survey is renowned for its clarity as well as its comprehensive treatment. The first volume begins with an outline of laminar and turbulent flow. The remainder of the book treats a variety of aspects of turbulence: its statistical and Lagrangian descriptions, shear flows near surfaces and free turbulenc

  14. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    Science.gov (United States)

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, the Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data, the Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data, aimed at exploring correlations with climate and weather data, is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design, that is, the result of combining datasets from different sampling designs, characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing package. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially
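
    The report's central point, that design weights from unequal-probability (GRTS) sampling belong in the trend model, can be illustrated outside R as well. A minimal Python analogue using weighted least squares; the simulated data, inclusion probabilities and weight construction are all illustrative, and the report itself worked with R's lm/lmer:

```python
# Sketch: include GRTS-style design weights in a linear trend model, a Python
# analogue of the report's weighted lm fits. Data and weights are simulated,
# and the site-level clustering structure is ignored here for brevity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n_sites, n_years = 60, 5
year = np.tile(np.arange(n_years), n_sites)

# Unequal inclusion probabilities produce unequal design weights (1 / pi).
incl_prob = rng.uniform(0.2, 0.9, n_sites)
weights = np.repeat(1.0 / incl_prob, n_years)

cover = 40 + 1.5 * year + rng.normal(0, 5, n_sites * n_years)  # % cover

X = sm.add_constant(year.astype(float))
fit = sm.WLS(cover, X, weights=weights).fit()
print(fit.params)           # design-weighted intercept and annual trend
```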

  15. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    Science.gov (United States)

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities was implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and the proportion of patients meeting the target HbA1C showed improvement, as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. The intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
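
    The control charts used in this project, for the panel mean HbA1C and for the proportion of patients meeting target, can be sketched directly. A minimal p-chart on simulated monthly panel data; the baseline rate, panel sizes, target threshold and the timing of the improvement are invented for illustration:

```python
# Sketch: p-chart for the monthly proportion of a diabetes panel meeting an
# HbA1C target, as in the QI project. Panel data are simulated; the baseline
# rate and post-intervention rate are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(6)
panel_sizes = rng.integers(60, 90, 12)              # patients reviewed per month
base_rate = 0.35                                    # baseline share at target
met = rng.binomial(panel_sizes, base_rate)
met[9:] = rng.binomial(panel_sizes[9:], 0.55)       # improvement after month 9

p = met / panel_sizes
p_bar = met[:9].sum() / panel_sizes[:9].sum()       # baseline centre line

for month, (pi, ni) in enumerate(zip(p, panel_sizes), start=1):
    se = np.sqrt(p_bar * (1 - p_bar) / ni)          # limits vary with panel size
    ucl, lcl = p_bar + 3 * se, p_bar - 3 * se
    flag = "special cause" if (pi > ucl or pi < lcl) else ""
    print(f"month {month:2d}: p = {pi:.2f} (UCL {ucl:.2f}) {flag}")
```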

  16. Fundamentals of statistics

    CERN Document Server

    Mulholland, Henry

    1968-01-01

    Fundamentals of Statistics covers topics on the introduction, fundamentals, and science of statistics. The book discusses the collection, organization and representation of numerical data; elementary probability; the binomial Poisson distributions; and the measures of central tendency. The text describes measures of dispersion for measuring the spread of a distribution; continuous distributions for measuring on a continuous scale; the properties and use of normal distribution; and tests involving the normal or student's 't' distributions. The use of control charts for sample means; the ranges

  17. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  18. Bathymetric survey of the Cayuga Inlet flood-control channel and selected tributaries in Ithaca, New York, 2016

    Science.gov (United States)

    Wernly, John F.; Nystrom, Elizabeth A.; Coon, William F.

    2017-09-08

    From July 14 to July 20, 2016, the U.S. Geological Survey, in cooperation with the City of Ithaca, New York, and the New York State Department of State, surveyed the bathymetry of the Cayuga Inlet flood-control channel and the mouths of selected tributaries to Cayuga Inlet and Cayuga Lake in Ithaca, N.Y. The flood-control channel, built by the U.S. Army Corps of Engineers between 1965 and 1970, was designed to convey flood flows from the Cayuga Inlet watershed through the City of Ithaca and minimize possible flood damages. Since that time, the channel has infrequently been maintained by dredging, and sediment accumulation and resultant shoaling have greatly decreased the conveyance of the channel and its navigational capability.U.S. Geological Survey personnel collected bathymetric data by using an acoustic Doppler current profiler. The survey produced a dense dataset of water depths that were converted to bottom elevations. These elevations were then used to generate a geographic information system bathymetric surface. The bathymetric data and resultant bathymetric surface show the current condition of the channel and provide the information that governmental agencies charged with maintaining the Cayuga Inlet for flood-control and navigational purposes need to make informed decisions regarding future maintenance measures.

  19. Introduction of an automated mine surveying system - a method for effective control of mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Mazhdrakov, M.

    1987-04-01

    Reviews developments in automated processing of mine survey data in Bulgaria for 1965-1970. This development has occurred in three phases. In the first phase, computers calculated coordinates of mine survey points; in the second phase, these data were electronically processed; in the third phase, surface and underground mine development is controlled by electronic data processing equipment. Centralized and decentralized electronic processing of data has been introduced at major coal mines. The Bulgarian Pravets 82 microcomputer and the ASMO-MINI program package are in current use at major coal mines. A lack of plotters, due to financial limitations, handicaps large-scale application of automated mine surveying in Bulgaria.

  20. Using a statistical process control chart during the quality assessment of cancer registry data.

    Science.gov (United States)

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.

  1. The effect of a monetary incentive for administrative assistants on the survey response rate: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Arnav Agarwal

    2016-08-01

    Full Text Available Abstract Background There is sufficient evidence that monetary incentives are effective in increasing survey response rates in the general population as well as with physicians. The objective of this study was to assess the impact of a monetary incentive intended for administrative assistants on the survey response rate of physicians in leadership positions. Methods This was an ancillary study to a national survey of chairs of academic Departments of Medicine in the United States about measuring faculty productivity. We randomized survey participants to receive or not receive a $5 gift card enclosed in the survey package. The cover letter explained that the gift card was intended for the administrative assistants as a “thank you for their time.” We compared the response rates between the 2 study arms using the Chi-square test. Results Out of 152 participants to whom survey packages were mailed to, a total of 78 responses were received (51 % response rate. The response rates were 59 % in the incentive arm and 46 % in the no incentive arm. The relative effect of the incentive compared to no monetary incentive was borderline statistically significant (relative risk (RR = 1.36, 95 % confidence interval (CI 0.99 to 1.87; p = 0.055. Conclusion Monetary incentives intended for administrative assistants likely increase the response rate of physicians in leadership positions.

  2. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  3. ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM (EMAP): WESTERN STREAMS AND RIVERS STATISTICAL SUMMARY

    Science.gov (United States)

    This statistical summary reports data from the Environmental Monitoring and Assessment Program (EMAP) Western Pilot (EMAP-W). EMAP-W was a sample survey (or probability survey, often simply called 'random') of streams and rivers in 12 states of the western U.S. (Arizona, Californ...

  4. National Mental Health Services Survey (N-MHSS-2010)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The National Mental Health Services Survey (N-MHSS) is an annual survey designed to collect statistical information on the numbers and characteristics of all known...

  5. The Association of Academic Health Sciences Libraries Annual Statistics: a thematic history.

    Science.gov (United States)

    Shedlock, James; Byrd, Gary D

    2003-04-01

    The Annual Statistics of Medical School Libraries in the United States and Canada (Annual Statistics) is the most recognizable achievement of the Association of Academic Health Sciences Libraries in its history to date. This article gives a thematic history of the Annual Statistics, emphasizing the leadership role of editors and Editorial Boards, the need for cooperation and membership support to produce comparable data useful for everyday management of academic medical center libraries and the use of technology as a tool for data gathering and publication. The Annual Statistics' origin is recalled, and survey features and content are related to the overall themes. The success of the Annual Statistics is evident in the leadership skills of the first editor, Richard Lyders, executive director of the Houston Academy of Medicine-Texas Medical Center Library. The history shows the development of a survey instrument that strives to produce reliable and valid data for a diverse group of libraries while reflecting the many complex changes in the library environment. The future of the Annual Statistics is assured by the anticipated changes facing academic health sciences libraries, namely the need to reflect the transition from a physical environment to an electronic operation.

  6. Do Introductory Statistics Courses in the United States Improve Students' Attitudes?

    Science.gov (United States)

    Schau, Candace; Emmioglu, Esma

    2012-01-01

    We examined the attitudes of about 2200 students enrolled in 101 sections of post-secondary introductory statistics service courses located across the United States. Using the "Survey of Attitudes Toward Statistics-36," we assessed students' attitudes when they entered and left their courses, as well as changes in attitudes across their courses.…

  7. STATISTICAL DISTRIBUTION PATTERNS IN MECHANICAL AND FATIGUE PROPERTIES OF METALLIC MATERIALS

    OpenAIRE

    Tatsuo, SAKAI; Masaki, NAKAJIMA; Keiro, TOKAJI; Norihiko, HASEGAWA; Department of Mechanical Engineering, Ritsumeikan University; Department of Mechanical Engineering, Toyota College of Technology; Department of Mechanical Engineering, Gifu University; Department of Mechanical Engineering, Gifu University

    1997-01-01

    Many papers on the statistical aspect of materials strength have been collected and reviewed by The Research Group for Statistical Aspects of Materials Strength.A book of "Statistical Aspects of Materials Strength" was written by this group, and published in 1992.Based on the experimental data compiled in this book, distribution patterns of mechanical properties are systematically surveyed paying an attention to metallic materials.Thus one can obtain the fundamental knowledge for a reliabilit...

  8. The impact of televised tobacco control advertising content on campaign recall: Evidence from the International Tobacco Control (ITC) United Kingdom Survey

    Science.gov (United States)

    2014-01-01

    Background Although there is some evidence to support an association between exposure to televised tobacco control campaigns and recall among youth, little research has been conducted among adults. In addition, no previous work has directly compared the impact of different types of emotive campaign content. The present study examined the impact of increased exposure to tobacco control advertising with different types of emotive content on rates and durations of self-reported recall. Methods Data on recall of televised campaigns from 1,968 adult smokers residing in England through four waves of the International Tobacco Control (ITC) United Kingdom Survey from 2005 to 2009 were merged with estimates of per capita exposure to government-run televised tobacco control advertising (measured in GRPs, or Gross Rating Points), which were categorised as either “positive” or “negative” according to their emotional content. Results Increased overall campaign exposure was found to significantly increase probability of recall. For every additional 1,000 GRPs of per capita exposure to negative emotive campaigns in the six months prior to survey, there was a 41% increase in likelihood of recall (OR = 1.41, 95% CI: 1.24–1.61), while positive campaigns had no significant effect. Increased exposure to negative campaigns in both the 1–3 months and 4–6 month periods before survey was positively associated with recall. Conclusions Increased per capita exposure to negative emotive campaigns had a greater effect on campaign recall than positive campaigns, and was positively associated with increased recall even when the exposure had occurred more than three months previously. PMID:24885426

  9. Appalachian National Scenic Trail pilot survey

    Science.gov (United States)

    Stan Zarnoch; Michael Bowker; Ken Cordell; Matt Owens; Gary T. Green; Allison Ginn

    2011-01-01

    Visitation statistics on the Appalachian National Scenic Trail (AT) are important for management and Federal Government reporting purposes. However, no survey methodology has been developed to obtain accurate trailwide estimates over linear trails that traverse many hundreds of back-country miles. This research develops a stratified random survey design which utilizes...

  10. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Science.gov (United States)

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in operating room (OR) and set a benchmark by SPC after stabilized process. The OR profile of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated;, and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.

  11. Minerals industry survey, 1984

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    This is the seventh edition of the statistical survey commissioned by the Australian Mining Industry Council. It represents the most comprehensive review of the financial position of the Australian minerals industry and provides timely financial data on the minerals industry. The tables of this survey have been prepared for AMIC by Coopers and Lybrand, Chartered Accountants, based on information supplied to them in confidence by the respondent companies. For the purpose of the survey, the minerals industry has been defined as including exploration for, and extraction and primary processing of, minerals in Australia. The oil and gas industry is not included.

  12. Secondary Data Analysis of National Surveys in Japan Toward Improving Population Health

    Science.gov (United States)

    Ikeda, Nayu

    2016-01-01

    Secondary data analysis of national health surveys of the general population is a standard methodology for health metrics and evaluation; it is used to monitor trends in population health over time and benchmark the performance of health systems. In Japan, the government has established electronic databases of individual records from national surveys of the population’s health. However, the number of publications based on these datasets is small considering the scale and coverage of the surveys. There appear to be two major obstacles to the secondary use of Japanese national health survey data: strict data access control under the Statistics Act and an inadequate interdisciplinary research environment for resolving methodological difficulties encountered when dealing with secondary data. The usefulness of secondary analysis of survey data is evident with examples from the author’s previous studies based on vital records and the National Health and Nutrition Surveys, which showed that (i) tobacco smoking and high blood pressure are the major risk factors for adult mortality from non-communicable diseases in Japan; (ii) the decrease in mean blood pressure in Japan from the late 1980s to the early 2000s was partly attributable to the increased use of antihypertensive medication and reduced dietary salt intake; and (iii) progress in treatment coverage and control of high blood pressure is slower in Japan than in the United States and Britain. National health surveys in Japan are an invaluable asset, and findings from secondary analyses of these surveys would provide important suggestions for improving health in people around the world. PMID:26902170

  13. Technical basis for tumbleweed survey requirements and disposal criteria

    International Nuclear Information System (INIS)

    J. D. Arana

    2000-01-01

    This technical basis document describes the technique for surveying potentially contaminated tumbleweeds in areas where the Environmental Restoration Contractor has jurisdiction and the disposal criteria based on these survey results. The report also discusses the statistical basis for surveys and the historical basis for the assumptions that are used to interpret the surveys

  14. Technical Basis for Tumbleweed Survey Requirements and Disposal Criteria

    International Nuclear Information System (INIS)

    Arana, J.D.

    2000-01-01

    This technical basis document describes the technique for surveying potentially contaminated tumbleweeds in areas where the Environmental Restoration Contractor has jurisdiction and the disposal criteria based on these survey results. The report also discusses the statistical basis for surveys and the historical basis for the assumptions that are used to interpret the surveys

  15. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    Science.gov (United States)

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  16. Using Statistical Process Control Charts to Identify the Steroids Era in Major League Baseball: An Educational Exercise

    Science.gov (United States)

    Hill, Stephen E.; Schvaneveldt, Shane J.

    2011-01-01

    This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…

  17. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    Science.gov (United States)

    Karadag, Engin

    2010-01-01

    To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…

  18. Order-Specific Fertility Rates for Germany
    Estimates from Perinatal Statistics for the Period 2001-2008

    OpenAIRE

    Michaela Kreyenfeld; Rembrandt Scholz; Frederik Peters; Ines Wlosnewski

    2011-01-01

    Until 2008, Germany’s vital statistics did not include information on the biological order of each birth. This resulted in a dearth of important demographic indicators, such as the mean age at first birth and the level of childlessness. Researchers have tried to fill this gap by generating order-specific birth rates from survey data, and by combining survey data with vital statistics. This paper takes a different approach by using Perinatal Statistics to generate birth order-specific fertilit...

  19. American Housing Survey (AHS)

    Data.gov (United States)

    Department of Housing and Urban Development — The AHS is the largest, regular national housing sample survey in the United States. The U.S. Census Bureau conducts the AHS to obtain up-to-date housing statistics...

  20. The Extended Northern ROSAT Galaxy Cluster Survey (NORAS II). I. Survey Construction and First Results

    International Nuclear Information System (INIS)

    Böhringer, Hans; Chon, Gayoung; Trümper, Joachim; Retzlaff, Jörg; Meisenheimer, Klaus; Schartel, Norbert

    2017-01-01

    As the largest, clearly defined building blocks of our universe, galaxy clusters are interesting astrophysical laboratories and important probes for cosmology. X-ray surveys for galaxy clusters provide one of the best ways to characterize the population of galaxy clusters. We provide a description of the construction of the NORAS II galaxy cluster survey based on X-ray data from the northern part of the ROSAT All-Sky Survey. NORAS II extends the NORAS survey down to a flux limit of 1.8 × 10 −12 erg s −1 cm −2 (0.1–2.4 keV), increasing the sample size by about a factor of two. The NORAS II cluster survey now reaches the same quality and depth as its counterpart, the southern REFLEX II survey, allowing us to combine the two complementary surveys. The paper provides information on the determination of the cluster X-ray parameters, the identification process of the X-ray sources, the statistics of the survey, and the construction of the survey selection function, which we provide in numerical format. Currently NORAS II contains 860 clusters with a median redshift of z  = 0.102. We provide a number of statistical functions, including the log N –log S and the X-ray luminosity function and compare these to the results from the complementary REFLEX II survey. Using the NORAS II sample to constrain the cosmological parameters, σ 8 and Ω m , yields results perfectly consistent with those of REFLEX II. Overall, the results show that the two hemisphere samples, NORAS II and REFLEX II, can be combined without problems into an all-sky sample, just excluding the zone of avoidance.

  1. The Extended Northern ROSAT Galaxy Cluster Survey (NORAS II). I. Survey Construction and First Results

    Energy Technology Data Exchange (ETDEWEB)

    Böhringer, Hans; Chon, Gayoung; Trümper, Joachim [Max-Planck-Institut für Extraterrestrische Physik, D-85748 Garching (Germany); Retzlaff, Jörg [ESO, D-85748 Garching (Germany); Meisenheimer, Klaus [Max-Planck-Institut für Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Schartel, Norbert [ESAC, Camino Bajo del Castillo, Villanueva de la Cañada, E-28692 Madrid (Spain)

    2017-05-01

    As the largest, clearly defined building blocks of our universe, galaxy clusters are interesting astrophysical laboratories and important probes for cosmology. X-ray surveys for galaxy clusters provide one of the best ways to characterize the population of galaxy clusters. We provide a description of the construction of the NORAS II galaxy cluster survey based on X-ray data from the northern part of the ROSAT All-Sky Survey. NORAS II extends the NORAS survey down to a flux limit of 1.8 × 10{sup −12} erg s{sup −1} cm{sup −2} (0.1–2.4 keV), increasing the sample size by about a factor of two. The NORAS II cluster survey now reaches the same quality and depth as its counterpart, the southern REFLEX II survey, allowing us to combine the two complementary surveys. The paper provides information on the determination of the cluster X-ray parameters, the identification process of the X-ray sources, the statistics of the survey, and the construction of the survey selection function, which we provide in numerical format. Currently NORAS II contains 860 clusters with a median redshift of z  = 0.102. We provide a number of statistical functions, including the log N –log S and the X-ray luminosity function and compare these to the results from the complementary REFLEX II survey. Using the NORAS II sample to constrain the cosmological parameters, σ {sub 8} and Ω{sub m}, yields results perfectly consistent with those of REFLEX II. Overall, the results show that the two hemisphere samples, NORAS II and REFLEX II, can be combined without problems into an all-sky sample, just excluding the zone of avoidance.

  2. Forest statistics for West Virginia--1975 and 1989

    Science.gov (United States)

    Dawn M. Di Giovanni; Dawn M. Di Giovanni

    1990-01-01

    A statistical report on the fourth forest survey of West Virginia (1989). Findings are displayed in 119 tables containing estimates of forest area, number of trees, timber volume, tree biomass, and timber products output. Data are presented at three levels: state, geographic unit, and county.

  3. An analysis on intersectional collaboration on non-communicable chronic disease prevention and control in China: a cross-sectional survey on main officials of community health service institutions.

    Science.gov (United States)

    Li, Xing-Ming; Rasooly, Alon; Peng, Bo; JianWang; Xiong, Shu-Yu

    2017-11-10

    Our study aimed to design a tool of evaluating intersectional collaboration on Non-communicable Chronic Disease (NCD) prevention and control, and further to understand the current status of intersectional collaboration in community health service institutions of China. We surveyed 444 main officials of community health service institutions in Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 by using a questionnaire. A model of collaboration measurement, including four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of evaluation scale in NCD management procedures across community healthcare institutions and other ones. Reliability and validity of the evaluation tool on inter-organizational collaboration on NCD prevention and control were verified. The test on tool evaluating inter-organizational collaboration in community NCD management revealed a good reliability and validity (Cronbach's Alpha = 0.89,split-half reliability = 0.84, the variance contribution rate of an extracted principal component = 49.70%). The results of inter-organizational collaboration of different departments and management segments showed there were statistically significant differences in formalization dimension for physical examination (p = 0.01).There was statistically significant difference in governance dimension, formalization dimension and total score of the collaboration scale for health record sector (p = 0.01,0.00,0.00). Statistical differences were found in the formalization dimension for exercise and nutrition health education segment (p = 0.01). There were no statistically significant difference in formalization dimension of medication guidance for psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). The multi-department collaboration mechanism of NCD prevention and control has been rudimentarily established. Community management

  4. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    Science.gov (United States)

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  5. Business Statistics at the Top 50 US Business Programmes

    Science.gov (United States)

    Haskin, Heather N.; Krehbiel, Timothy C.

    2012-01-01

    We surveyed fifty leading undergraduate business schools concerning their statistics requirements. We report on many aspects including credit-hours required, topics covered, computer integration, faculty background, teaching pedagogy, textbooks, and recent and proposed changes. (Contains 8 tables.)

  6. Fishing effort statistics of the artisanal fisheries of the Cross River ...

    African Journals Online (AJOL)

    Frame surveys were carried out in 1997 and 1998 to assess the effort statistics of the artisanal fisheries of the Cross River Estuary. These surveys covered the inner Estuary and the West coast of the outer Estuary. Fishing effort was taken as number of fishers, number of canoes, and types of fishing gears. A total of 64 fishing ...

  7. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulation Part 11 (21 CFR 211.180), all finished products should be reviewed annually for the quality standards to determine the need of any change in specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining only the effect of a single factor at the time using a Shewhart's chart. It neglects to take into account the interaction between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, where 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch has been checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA), for the data being either autoscaled or robust scaled, showed four and seven batches, respectively, out of the Hotelling T 2 95% ellipse. Also, an improvement of the capability of the process is observed without the most extreme batches. The MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  8. Elementary methods for statistical systems, mean field, large-n, and duality

    International Nuclear Information System (INIS)

    Itzykson, C.

    1983-01-01

    Renormalizable field theories are singled out by such precise restraints that regularization schemes must be used to break these invariances. Statistical methods can be adapted to these problems where asymptotically free models fail. This lecture surveys approximation schemes developed in the context of statistical mechanics. The confluence point of statistical mechanics and field theory is the use of discretized path integrals, where continuous space time has been replaced by a regular lattice. Dynamic variables, a Boltzman weight factor, and boundary conditions are the ingredients. Mean field approximations --field equations, Random field transform, and gauge invariant systems--are surveyed. Under Large-N limits vector models are found to simplify tremendously. The reasons why matrix models drawn from SU (n) gauge theories do not simplify are discussed. In the epilogue, random curves versus random surfaces are offered as an example where global and local symmetries are not alike

  9. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    Science.gov (United States)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-91 3NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26 inches Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995, subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor

  10. Statistical Data Processing with R – Metadata Driven Approach

    Directory of Open Access Journals (Sweden)

    Rudi SELJAK

    2016-06-01

    Full Text Available In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions, based on the metadata driven approach. This means that one general program code, which is parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully statistical software R can be used for this approach. Paper describes the metadata driven principle for data validation, generic software solution and main issues connected with the use of statistical software R for this approach.

  11. Excluding Institutionalized Elderly from Surveys: Consequences for Income and Poverty Statistics

    Science.gov (United States)

    Peeters, Hans; Debels, Annelies; Verpoorten, Rika

    2013-01-01

    Growing life expectancy and changes in financial, marriage and labour markets have placed the income position of the elderly at the center of scientific and political discourse. As a consequence, the last decades witnessed the publication of various influential reports that contained comparative statistics on old age income inequalities on the…

  12. A Statistical Study of Brown Dwarf Companions from the SDSS-III MARVELS Survey

    Science.gov (United States)

    Grieves, Nolan; Ge, Jian; Thomas, Neil; Ma, Bo; De Lee, Nathan M.; Lee, Brian L.; Fleming, Scott W.; Sithajan, Sirinrat; Varosi, Frank; Liu, Jian; Zhao, Bo; Li, Rui; Agol, Eric; MARVELS Team

    2016-01-01

    We present 23 new Brown Dwarf (BD) candidates from the Multi-object APO Radial-Velocity Exoplanet Large-Area Survey (MARVELS) of the Sloan Digital Sky Survey III (SDSS-III). The BD candidates were selected from the processed MARVELS data using the latest University of Florida 2D pipeline, which shows significant improvement and reduction of systematic errors over the 1D pipeline results included in the SDSS Data Release 12. This sample is the largest BD yield from a single radial velocity survey. Of the 23 candidates, 18 are around main sequence stars and 5 are around giant stars. Given a giant contamination rate of ~24% for the MARVELS survey, we find a BD occurrence rate around main sequence stars of ~0.7%, which agrees with previous studies and confirms the BD desert, while the BD occurrence rate around the MARVELS giant stars is ~0.6%. Preliminary results show that our new candidates around solar type stars support a two population hypothesis, where BDs are divided at a mass of ~42.5 MJup. BDs less massive than 42.5 MJup have eccentricity distributions consistent with planet-planet scattering models, where BDs more massive than 42.5 MJup have both period and eccentricity distributions similar to that of stellar binaries. Special Brown Dwarf systems such as multiple BD systems and highly eccentric BDs will also be presented.

  13. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    Science.gov (United States)

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Evaluation of undergraduate nursing students' attitudes towards statistics courses, before and after a course in applied statistics.

    Science.gov (United States)

    Hagen, Brad; Awosoga, Olu; Kellett, Peter; Dei, Samuel Ofori

    2013-09-01

    Undergraduate nursing students must often take a course in statistics, yet there is scant research to inform teaching pedagogy. The objectives of this study were to assess nursing students' overall attitudes towards statistics courses - including (among other things) overall fear and anxiety, preferred learning and teaching styles, and the perceived utility and benefit of taking a statistics course - before and after taking a mandatory course in applied statistics. The authors used a pre-experimental research design (a one-group pre-test/post-test research design), by administering a survey to nursing students at the beginning and end of the course. The study was conducted at a University in Western Canada that offers an undergraduate Bachelor of Nursing degree. Participants included 104 nursing students, in the third year of a four-year nursing program, taking a course in statistics. Although students only reported moderate anxiety towards statistics, student anxiety about statistics had dropped by approximately 40% by the end of the course. Students also reported a considerable and positive change in their attitudes towards learning in groups by the end of the course, a potential reflection of the team-based learning that was used. Students identified preferred learning and teaching approaches, including the use of real-life examples, visual teaching aids, clear explanations, timely feedback, and a well-paced course. Students also identified preferred instructor characteristics, such as patience, approachability, in-depth knowledge of statistics, and a sense of humor. Unfortunately, students only indicated moderate agreement with the idea that statistics would be useful and relevant to their careers, even by the end of the course. Our findings validate anecdotal reports on statistics teaching pedagogy, although more research is clearly needed, particularly on how to increase students' perceptions of the benefit and utility of statistics courses for their nursing

  15. A Study of Faculty Views of Statistics and Student Preparation beyond an Introductory Class

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura; Smith, Jessalyn

    2013-01-01

    The purpose of this research is to better understand the role of statistics in teaching and research by faculty from all disciplines and their perceptions of the statistical preparation of their students. This study reports the findings of a survey administered to faculty from seven colleges and universities regarding the use of statistics in…

  16. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method: An Example with Alendronate

    NARCIS (Netherlands)

    van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

    2016-01-01

    For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in

  17. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method : An Example with Alendronate

    NARCIS (Netherlands)

    van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

    For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in

  18. Minerals industry survey 1987

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    This is the eleventh Minerals Industry Survey produced by the Australian Mining Industry Council. It represents an invaluable time series on the minerals industry's financial performance, as well as an up to date description of the industry for the latest financial year. The survey has been conceived as a supplement to and expansion of the various Australian Bureau of Statistics and Bureau of Mineral Resources, Geology and Geophysics publications which describe the exploration, mining and smelting and refining industries in Australia. The tables in this survey have been prepared by Coopers and Lybrand, Chartered Accountants, based on information supplied to them in confidence by the respondent companies.

  19. A knowledge - based system to assist in the design of soil survey schemes

    NARCIS (Netherlands)

    Domburg, P.

    1994-01-01

    Soil survey information with quantified accuracy is relevant to decisions on land use and environmental problems. To obtain such information statistical strategies should be used for collecting and analysing data. A survey project based on a statistical sampling strategy requires a soil

  20. Regret and rationalization among smokers in Thailand and Malaysia: findings from the International Tobacco Control Southeast Asia Survey.

    Science.gov (United States)

    Lee, Wonkyong B; Fong, Geoffrey T; Zanna, Mark P; Omar, Maizurah; Sirirassamee, Buppha; Borland, Ron

    2009-07-01

    To test whether differences of history and strength in tobacco control policies will influence social norms, which, in turn, will influence quit intentions, by influencing smokers' regret and rationalization. The data were from the International Tobacco Control (ITC) Policy Evaluation Southeast Asia Survey, a cohort survey of representative samples of adult smokers in Thailand (N = 2,000) and Malaysia (N = 2,006). The survey used a stratified multistage sampling design. Measures included regret, rationalization, social norms, and quit intention. Thai smokers were more likely to have quit intentions than Malaysian smokers. This difference in quit intentions was, in part, explained by the country differences in social norms, regret, and rationalization. Reflecting Thailand's history of stronger tobacco control policies, Thai smokers, compared with Malaysian smokers, perceived more negative social norms toward smoking, were more likely to regret, and less likely to rationalize smoking. Mediational analyses revealed that these differences in social norms, accounted, in part, for the country-quit intention relation and that regret and rationalization accounted, in part, for the social norm-quit intention relation. The results suggest that social norms toward smoking, which are shaped by tobacco control policies, and smokers' regret and rationalization influence quit intentions.

  1. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using a logistic regression and stepwise variable selection was proposed, and the effects of the PSF on HEPs related with the soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitation of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated

  2. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy].

    Science.gov (United States)

    Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C

    2010-01-01

    Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, its root cause was identified and eliminated to prevent set-up errors. Set up errors were measured for medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions and then the upper control limits were calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using sub-groups of 3 patients, three times each shift. These values were plotted on a control chart in real time. Control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on a X chart, helped monitor the set-up process stability and, if and when the stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified and corrective action was taken before proceeding with the treatment. SPC protocol focuses on controlling the variability due to assignable cause instead of focusing on patient-to-patient variability which normally does not exist. Compared to weekly sampling of set-up error in each and every patient, which may only ensure that just those sampled sessions were set-up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces the control costs. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.

  3. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P ampersand A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA- approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost effective and representative approach to establishing statistical method performances criteria would be either to utilize a moving average of P ampersand A from control samples over a several month time period or to determine within a sample variation by one-way analysis of variance of several months replicate sample analysis results or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in

  4. Attitude towards statistics and performance among post-graduate students

    Science.gov (United States)

    Rosli, Mira Khalisa; Maat, Siti Mistima

    2017-05-01

    For student to master Statistics is a necessity, especially for those post-graduates that are involved in the research field. The purpose of this research was to identify the attitude towards Statistics among the post-graduates and to determine the relationship between the attitude towards Statistics and post-graduates' of Faculty of Education, UKM, Bangi performance. 173 post-graduate students were chosen randomly to participate in the study. These students registered in Research Methodology II course that was introduced by faculty. A survey of attitude toward Statistics using 5-points Likert scale was used for data collection purposes. The instrument consists of four components such as affective, cognitive competency, value and difficulty. The data was analyzed using the SPSS version 22 in producing the descriptive and inferential Statistics output. The result of this research showed that there is a medium and positive relation between attitude towards statistics and students' performance. As a conclusion, educators need to access students' attitude towards the course to accomplish the learning outcomes.

  5. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance to understand various interference effects, in which the stationary phase difference between two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in the high-order optical coherence the statistic behavior of the optical phase will play the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high- order optical coherence between two classical point sources can be actively designed by controlling the statistic behavior of the relative phase difference between two point sources. Synchronous position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of point sources and M is an integer not larger than N. Interestingly, we found that the synchronous position Nth-order interference fringe fingerprints the statistic trace of random phase fluctuation of two classical point sources, therefore, it provides an effective way to characterize the statistic properties of phase fluctuation for incoherent light sources.

  6. Characteristics and Performance of Students in an Online Section of Business Statistics

    Science.gov (United States)

    Dutton, John; Dutton, Marilyn

    2005-01-01

    We compare students in online and lecture sections of a business statistics class taught simultaneously by the same instructor using the same content, assignments, and exams in the fall of 2001. Student data are based on class grades, registration records, and two surveys. The surveys asked for information on preparedness, reasons for section…

  7. Self-rated health assessed by web versus mail modes in a mixed mode survey: the digital divide effect and the genuine survey mode effect.

    Science.gov (United States)

    Shim, Jae-Mahn; Shin, Eunjung; Johnson, Timothy P

    2013-09-01

    To investigate differences in self-rated health (SRH) between web and mail questionnaires in a mixed mode survey and to provide a model that explains those differences. A total of 15,200 mail respondents and 17,829 web respondents from the 2008 US National Health Survey conducted by the Gallup Panel. Respondents were recruited using random digit dialing and assigned to one of the two survey modes (web or mail). Respondents with household Internet connection and frequent Internet usage were invited to complete the survey through the web mode. Respondents who had no Internet connection or who used the Internet infrequently were invited to the mail mode. Thus, respondents with better Internet access used the web mode. Respondents completed a questionnaire that asked about SRH status, objective health conditions, health behaviors, and other socioeconomic variables. Statistical associations were analyzed with ordered Logit and negative binomial models. Web respondents reported better SRH than mail respondents. This difference is in part reflective of variability in objective health status between these two groups, and in part attributable to the effects of survey mode. These results maintained with age controlled. The alignment between survey mode selection, Internet access, and health disparities, as well as genuine survey mode characteristics, leads to web-mail differences in SRH. Unless the digital divide and its influences on survey mode selection are resolved and differential genuine mode effects are fully comprehended, we recommend that both modes be simultaneously used on a complementary basis.

  8. Software for creating quality control database in diagnostic radiology

    International Nuclear Information System (INIS)

    Stoeva, M.; Spassov, G.; Tabakov, S.

    2000-01-01

    The paper describes a PC based program with database for quality control (QC). It keeps information about all surveyed equipment and measured parameters. The first function of the program is to extract information from old (existing) MS Excel spreadsheets with QC surveys. The second function is used for input of measurements which are automatically organized in MS Excel spreadsheets and built into the database. The spreadsheets are based on the protocols described in the EMERALD Training Scheme. In addition, the program can make statistics of all measured parameters, both in absolute term and in time

  9. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    Science.gov (United States)

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of a SEM model is T ML, a slight modification to the likelihood ratio statistic. Under normality assumption, T ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T ML rejects the correct model too often when p is not too small. Various corrections to T ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correct T ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T ML, and they control type I errors reasonably well whenever N ≥ max(50,2p). The formulations of the empirically corrected statistics are further used to predict type I errors of T ML as reported in the literature, and they perform well.

  10. Optimal Design and Related Areas in Optimization and Statistics

    CERN Document Server

    Pronzato, Luc

    2009-01-01

    This edited volume, dedicated to Henry P. Wynn, reflects his broad range of research interests, focusing in particular on the applications of optimal design theory in optimization and statistics. It covers algorithms for constructing optimal experimental designs, general gradient-type algorithms for convex optimization, majorization and stochastic ordering, algebraic statistics, Bayesian networks and nonlinear regression. Written by leading specialists in the field, each chapter contains a survey of the existing literature along with substantial new material. This work will appeal to both the

  11. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would give operators ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
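
    For readers unfamiliar with the charts mentioned, the following is a minimal sketch of an X-bar/R control check on made-up subgroup data; it is not the Ft. Calhoun analysis, and the control-chart constants assume subgroups of five readings.

```python
# Shewhart X-bar and R charts on synthetic data (25 subgroups of 5 readings).
import numpy as np

A2, D3, D4 = 0.577, 0.0, 2.114  # standard constants for subgroup size n = 5

rng = np.random.default_rng(2)
subgroups = rng.normal(100.0, 2.0, size=(25, 5))

xbar = subgroups.mean(axis=1)                       # subgroup means
r = subgroups.max(axis=1) - subgroups.min(axis=1)   # subgroup ranges
xbar_bar, r_bar = xbar.mean(), r.mean()

# Control limits for the X-bar and R charts
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

out = (xbar > ucl_x) | (xbar < lcl_x) | (r > ucl_r) | (r < lcl_r)
print(f"X-bar limits: ({lcl_x:.2f}, {ucl_x:.2f}); R limits: ({lcl_r:.2f}, {ucl_r:.2f})")
print("subgroups out of control:", np.flatnonzero(out))
```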

  12. Report of the Survey on the Design Review of New Reactor Applications. Volume 1 - Instrumentation and Control

    International Nuclear Information System (INIS)

    Downey, Steven

    2014-06-01

    At the tenth meeting of the CNRA Working Group on the Regulation of New Reactors (WGRNR) in March 2013, the members agreed to present the responses to the Second Phase, or Design Phase, of the Licensing Process Survey as a multi-volume text. As such, each report will focus on one of the eleven general technical categories covered in the survey. The general technical categories were selected to conform to the topics covered in the International Atomic Energy Agency (IAEA) Safety Guide GS-G-4.1. This report, the first volume, discusses the survey responses related to Instrumentation and Control (I and C). The Instrumentation and Control category includes the following twelve technical topics: reactor trip system, actuation systems for Engineered Safety Features (ESF), safe shutdown system, safety-related display instrumentation, information and interlock systems important to safety, control systems, main control room, supplementary control room, diverse I and C systems, data communication systems, software reliability, and cyber-security. For each technical topic, the member countries described the information provided by the applicant, the scope and level of detail of the technical review, the technical basis for granting regulatory authorisation, the skill sets required, and the level of effort needed to perform the review. Based on a comparison of the information provided in response to the survey, the following observations were made: - Among the regulatory organisations that responded to the survey, there are similarities in the design information provided by an applicant. In most countries, the design information provided by an applicant includes, but is not limited to, a description of the I and C system design and functions, a description of the verification and validation programmes, and provisions for analysis, testing, and inspection of various I and C systems. - In addition to the regulations, it is a common practice for countries

  13. A statistical model of uplink inter-cell interference with slow and fast power control mechanisms

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2013-01-01

    Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
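
    The paper derives closed-form statistics; purely as an illustration of the setting, the sketch below Monte-Carlo-simulates uplink ICI under a simple fractional (slow) power control rule, with a Gamma-Gamma product standing in for Generalized-K composite fading. All geometry and parameter values are invented.

```python
# Monte Carlo sketch of uplink inter-cell interference under power control.
import numpy as np

rng = np.random.default_rng(3)
n_users, n_trials = 30, 10_000
alpha = 3.5          # path-loss exponent
epsilon = 0.8        # fractional power control factor (1.0 = full inversion)
p0, p_max = 0.01, 1.0

# Distances of interfering users to the victim base station, and to their
# own serving base stations (arbitrary units)
d_victim = rng.uniform(1.0, 5.0, size=(n_trials, n_users))
d_serving = rng.uniform(0.1, 1.0, size=(n_trials, n_users))

# Composite fading/shadowing approximated by a product of two Gammas
fading = rng.gamma(2.0, 0.5, d_victim.shape) * rng.gamma(4.0, 0.25, d_victim.shape)

# Slow fractional power control toward each user's own serving cell
p_tx = np.minimum(p_max, p0 * d_serving ** (alpha * epsilon))

# Aggregate ICI received at the victim base station
ici = (p_tx * fading * d_victim ** (-alpha)).sum(axis=1)
print(f"mean ICI = {ici.mean():.4g}, 95th percentile = {np.quantile(ici, 0.95):.4g}")
```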

  15. A bibliometric analysis of 50 years of worldwide research on statistical process control

    Directory of Open Access Journals (Sweden)

    Fabiane Letícia Lizarelli

    An increasing number of papers on statistical process control (SPC) have emerged in the last fifty years, especially in the last fifteen. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC plays a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. Several other statistical techniques and tools assist the continuous improvement and innovation of products and processes, but, despite the limitations on their use in improvement projects, there is growing interest in the use of SPC. Empirical research shows a gap between the SPC techniques taught in engineering courses and their practical application to industrial problems; thus, it is important to understand what has been done and to identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers, and their networks. The main sources of publications are also identified; the publication of SPC papers is concentrated in a few international research journals, not necessarily those with the highest impact factors. Furthermore, the papers are focused on the industrial engineering, operations research, and management science fields. The most common term found in the papers was cumulative sum control charts, but new topics have emerged and been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods.
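
    For context on the survey's most frequent term, here is a minimal tabular CUSUM on synthetic data; the reference value k and decision interval h are conventional textbook choices, not values taken from the surveyed papers.

```python
# Tabular CUSUM detecting a small upward mean shift in synthetic data.
import numpy as np

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(0.8, 1, 30)])

k, h = 0.5, 5.0            # reference value and decision interval (in sigmas)
c_plus = c_minus = 0.0
alarm = None
for i, xi in enumerate(x):
    c_plus = max(0.0, c_plus + xi - k)    # accumulates upward deviations
    c_minus = max(0.0, c_minus - xi - k)  # accumulates downward deviations
    if alarm is None and (c_plus > h or c_minus > h):
        alarm = i
print("first CUSUM alarm at observation:", alarm)
```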

  16. The 12th quality control survey for radioisotope in vitro tests in Japan, 1990

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    The results of the 12th nationwide quality control survey are presented. Of 670 selected facilities, 405 (60.4%) participated in this survey. Myoglobin and trypsin were added as new items to be examined. The other conventional items were as follows: alpha-fetoprotein (AFP), aldosterone, {beta}{sub 2}-microglobulin, carbohydrate antigen 15-3, C-peptide, digoxin, elastase 1, free triiodothyronine, growth hormone, immunoglobulin E, prostatic acid phosphatase, pancreatic secretory trypsin inhibitor, progesterone, prolactin, thyroglobulin, triiodothyronine (T{sub 3}), T{sub 3} uptake, tissue polypeptide antigen, thyroid stimulating hormone, and testosterone. There was a large between-kit coefficient of variation (CV) for AFP, aldosterone, progesterone, and prolactin. These results were analogous to those of previous surveys. For T{sub 3}, there was a large difference between the CV by radioimmunoassay and that by the non-isotopic method. Both myoglobin and trypsin also showed large between-kit differences. (N.K.).
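
    The between-kit CV reported in such surveys is the sample standard deviation of kit results divided by their mean; a tiny sketch with invented AFP values for one control sample:

```python
# Between-kit coefficient of variation (illustrative numbers only).
import numpy as np

kit_means = np.array([12.1, 14.8, 11.5, 16.2, 13.0])  # hypothetical AFP, ng/mL
cv = 100 * kit_means.std(ddof=1) / kit_means.mean()
print(f"between-kit CV = {cv:.1f}%")
```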

  17. IMPORTANCE OF MATERIAL BALANCES AND THEIR STATISTICAL EVALUATION IN RUSSIAN MATERIAL PROTECTION, CONTROL AND ACCOUNTING

    International Nuclear Information System (INIS)

    Fishbone, L.G.

    1999-01-01

    While substantial work has been performed in the Russian MPC and A Program, much more needs to be done at Russian nuclear facilities to complete four necessary steps. These are (1) periodically measuring the physical inventory of nuclear material, (2) continuously measuring the flows of nuclear material, (3) using the results to close the material balance, particularly at bulk processing facilities, and (4) statistically evaluating any apparent loss of nuclear material. The periodic closing of material balances provides an objective test of the facility's system of nuclear material protection, control and accounting. The statistical evaluation, which uses the uncertainties associated with the individual measurement systems involved in calculating the material balance, provides a fair standard for concluding whether an apparent loss of nuclear material indicates a diversion or whether the facility's accounting system needs improvement. In particular, if unattractive flow material at a facility is not measured well, the accounting system cannot readily detect the loss of attractive material if the latter substantially derives from the former.
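
    As a hedged sketch of step (4), the following tests whether a material balance closes within propagated measurement uncertainty (a MUF significance test); all quantities and uncertainties are illustrative, not from any facility.

```python
# Significance test for MUF (material unaccounted for) given propagated
# 1-sigma measurement uncertainties of the balance components.
import math

beginning_inventory, receipts = 1250.0, 300.0   # kg, illustrative
ending_inventory, shipments = 1240.0, 305.0     # kg, illustrative
muf = beginning_inventory + receipts - ending_inventory - shipments

sigmas = [2.0, 1.0, 2.0, 1.0]                   # assumed component uncertainties
sigma_muf = math.sqrt(sum(s**2 for s in sigmas))

z = muf / sigma_muf
print(f"MUF = {muf:.1f} kg, sigma = {sigma_muf:.2f} kg, z = {z:.2f}")
print("significant at ~5%" if abs(z) > 1.96 else "consistent with measurement error")
```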

  18. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    Science.gov (United States)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of the individual programs leading to a statistical correlation report, and a set of examples, including complete program listings and input and output data.
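
    The report's exact statistics are not reproduced here; as a stand-in, the sketch below computes the modal assurance criterion (MAC), a standard correlation measure between test and analytical mode shapes, on synthetic shapes.

```python
# MAC matrix between test and analytical (e.g. NASTRAN) mode shapes.
import numpy as np

def mac(phi_test, phi_fem):
    """MAC between columns of two real mode-shape matrices (dof x modes)."""
    num = np.abs(phi_test.T @ phi_fem) ** 2
    den = np.outer(np.sum(phi_test**2, axis=0), np.sum(phi_fem**2, axis=0))
    return num / den

rng = np.random.default_rng(5)
phi_fem = rng.normal(size=(20, 3))                   # hypothetical analysis modes
phi_test = phi_fem + 0.1 * rng.normal(size=(20, 3))  # noisy "measured" modes
print(np.round(mac(phi_test, phi_fem), 2))           # near-identity => good match
```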

  19. Disclosure control using partially synthetic data for large-scale health surveys, with applications to CanCORS.

    Science.gov (United States)

    Loong, Bronwyn; Zaslavsky, Alan M; He, Yulei; Harrington, David P

    2013-10-30

    Statistical agencies have begun to partially synthesize public-use data for major surveys to protect the confidentiality of respondents' identities and sensitive attributes by replacing high-disclosure-risk and sensitive variables with multiple imputations. To date, there are few applications of synthetic data techniques to large-scale healthcare survey data. Here, we describe partial synthesis of survey data collected by the Cancer Care Outcomes Research and Surveillance (CanCORS) project, a comprehensive observational study of the experiences, treatments, and outcomes of patients with lung or colorectal cancer in the USA. We review inferential methods for partially synthetic data and discuss the selection of high-disclosure-risk variables for synthesis, the specification of imputation models, and identification disclosure risk assessment. We evaluate data utility by replicating published analyses and comparing results obtained with the original and synthetic data, and we discuss practical issues in preserving inferential conclusions. We found that important subgroup relationships must be included in the synthetic data imputation model to preserve the data utility of the observed data for a given analysis procedure. We conclude that synthetic CanCORS data are best suited for preliminary data analysis purposes. These methods address the requirement to share data in clinical research without compromising confidentiality. Copyright © 2013 John Wiley & Sons, Ltd.
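
    The inferential methods reviewed include combining rules for estimates computed on each of the m partially synthetic datasets; a minimal Reiter-style sketch, with invented estimates and variances from m = 5 synthetic datasets:

```python
# Combining rules for partially synthetic data (Reiter-style):
# total variance = average within-dataset variance + between-variance / m.
import numpy as np

def combine_partially_synthetic(q, u):
    """q: point estimates from each synthetic dataset; u: their variances."""
    q, u = np.asarray(q), np.asarray(u)
    m = len(q)
    q_bar = q.mean()
    b = q.var(ddof=1)        # between-synthesis variance
    u_bar = u.mean()         # average within-dataset variance
    t = u_bar + b / m        # total variance for partial synthesis
    return q_bar, t

q_bar, t = combine_partially_synthetic([1.02, 0.97, 1.05, 1.00, 0.99],
                                       [0.04, 0.05, 0.04, 0.05, 0.04])
print(f"estimate = {q_bar:.3f}, std. error = {t**0.5:.3f}")
```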

  20. A handbook of statistical graphics using SAS ODS

    CERN Document Server

    Der, Geoff

    2014-01-01

    An Introduction to Graphics: Good Graphics, Bad Graphics, Catastrophic Graphics and Statistical Graphics; The Challenger Disaster; Graphical Displays; A Little History and Some Early Graphical Displays; Graphical Deception; An Introduction to ODS Graphics; Generating ODS Graphs; ODS Destinations; Statistical Graphics Procedures; ODS Graphs from Statistical Procedures; Controlling ODS Graphics; Controlling Labelling in Graphs; ODS Graphics Editor; Graphs for Displaying the Characteristics of Univariate Data: Horse Racing, Mortality Rates, Forearm Lengths, Survival Times and Geyser Eruptions; Introduction; Pie Chart, Bar Cha