WorldWideScience

Sample records for volume statistical process

  1. Representative volume size: A comparison of statistical continuum mechanics and statistical physics

    Energy Technology Data Exchange (ETDEWEB)

    AIDUN, JOHN B.; TRUCANO, TIMOTHY G.; LO, CHI S.; FYE, RICHARD M.

    1999-05-01

    In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.

  2. Columbia River Basin Seasonal Volumes and Statistics, 1928-1989: 1990 Level Modified Streamflows, Computed Seasonal Volumes, 61-Year Statistics.

    Energy Technology Data Exchange (ETDEWEB)

    A.G. Crook Company

    1993-04-01

    This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.

  3. Use of statistical process control in the production of blood components

    DEFF Research Database (Denmark)

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood components produced. Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test whether a difference was seen in the quality of the platelets produced when a change was made from a relatively large, inexperienced, occasional component manufacturing staff to an experienced regular manufacturing staff with four technologists. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional …

  4. Statistical analysis of non-homogeneous Poisson processes. Statistical processing of a particle multidetector

    International Nuclear Information System (INIS)

    Lacombe, J.P.

    1985-12-01

    The first part of this thesis is a statistical study of non-homogeneous and spatial Poisson processes. A Neyman-Pearson type test is defined for the intensity measure of these processes. Conditions are given under which consistency of the test is assured, and others under which the test statistics are asymptotically normal. Techniques for the statistical processing of Poisson fields and their application to the study of a particle multidetector are then given. Quality tests of the device are proposed together with signal extraction methods. [fr]
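    To make the flavor of such a test concrete, here is a minimal Python sketch of a Neyman-Pearson style likelihood-ratio test between two fully specified intensities of a non-homogeneous Poisson process on [0, T]. The intensities, horizon and threshold procedure are illustrative assumptions, not values from the thesis.

```python
# Illustrative Neyman-Pearson likelihood-ratio test for the intensity of a
# non-homogeneous Poisson process observed on [0, T]. All rates and the
# decision threshold are made-up example values, not from the thesis.
import numpy as np

rng = np.random.default_rng(0)
T = 10.0

def lam0(t):  # intensity under H0 (assumed)
    return 2.0 + 0.0 * t

def lam1(t):  # intensity under H1 (assumed)
    return 2.0 + 0.5 * t

def simulate_nhpp(lam, lam_max, T):
    """Simulate event times by thinning a rate-lam_max homogeneous process."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > T:
            return np.array(events)
        if rng.random() < lam(t) / lam_max:
            events.append(t)

def log_lik(times, lam, T, n_grid=10_000):
    """log L = sum_i log lam(t_i) - integral_0^T lam(t) dt (trapezoid rule)."""
    grid = np.linspace(0.0, T, n_grid)
    integral = np.trapz(lam(grid), grid)
    return np.sum(np.log(lam(times))) - integral

times = simulate_nhpp(lam1, lam_max=7.0, T=T)
lr = log_lik(times, lam1, T) - log_lik(times, lam0, T)
print(f"{len(times)} events, log-likelihood ratio = {lr:.2f}")
# Reject H0 when lr exceeds a threshold chosen for the desired test size,
# e.g. calibrated by Monte Carlo simulation under H0.
```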

  5. Frequency-volume Statistics of Rock Falls: Examples From France, Italy and California

    Science.gov (United States)

    Dussauge-Peisser, C.; Guzzetti, F.; Wieczorek, G. F.

    There is accumulating evidence that the distribution of rock-fall volumes exhibits power law (fractal) statistics in different physiographic and geologic environments. We have studied the frequency-volume statistics of rock falls in three areas: Grenoble, France; Umbria, Italy; and Yosemite Valley, California, USA. We present a comparison of the datasets currently available. For the Grenoble area, a catalogue covers rock falls that occurred between 1248 and 1995 along a 120-km-long limestone cliff. The dataset contains information on 105 rock-fall events ranging in size from 3x10^-2 to 5x10^8 m3. Only the time window 1935-1995 is considered in the study, involving 87 events from 10^-2 to 10^6 m3. The cumulative frequency-volume statistics follow a power-law (fractal) relationship with exponent b = -0.4 for volumes greater than 50 m3 [...]. For Yosemite Valley, the database contains information on historical (1851-2001) rock falls (122), rock slides (251) and prehistoric rock avalanches (5). For Yosemite, the non-cumulative frequency-volume statistics of rock falls and rock slides are very similar and correlate well with a power-law (fractal) relation with exponent beta = -1.4 for volumes greater than 30 m3 [...] volume distribution of rock avalanches. We discuss the implications of such a power law fitting the data for rock-fall hazard assessment. We also discuss the variation of the b and beta exponents for natural events and earthquake-triggered events.
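    As an illustration of the kind of power-law fit discussed above, the following Python sketch draws synthetic volumes from a Pareto tail and recovers the exponent with the standard Hill maximum-likelihood estimator. The exponent, cutoff and sample size are assumed for the example; they are not the catalogue data.

```python
# Sketch: estimate a power-law exponent b for cumulative frequency-volume
# statistics, N(V >= v) ~ v**(-b), from synthetic rock-fall volumes.
import numpy as np

rng = np.random.default_rng(1)
b_true, v_min, n = 0.4, 50.0, 500  # illustrative assumptions

# Pareto sampling: V = v_min * U**(-1/b) gives P(V >= v) = (v/v_min)**(-b)
volumes = v_min * rng.random(n) ** (-1.0 / b_true)

# Maximum-likelihood (Hill) estimator for the tail exponent
b_hat = n / np.sum(np.log(volumes / v_min))
se = b_hat / np.sqrt(n)  # rough standard error of the Hill estimate
print(f"b_hat = {b_hat:.3f} +/- {se:.3f} (true {b_true})")
```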

  6. Mathematical statistics and stochastic processes

    CERN Document Server

    Bosq, Denis

    2013-01-01

    Generally, books on mathematical statistics are restricted to the case of independent, identically distributed random variables. In this book, however, both this case and the case of dependent variables, i.e. statistics for discrete- and continuous-time processes, are studied. This second case is very important for today's practitioners. Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of the theory of probability, estimation, confidence intervals, non-parametric statistics and robustness.

  7. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research

  8. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  9. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, confronting high-dimensional data sets becomes a normal occurrence. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the "out-of-control" state of a process using control charts, in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling's T2. For high-dimensional data with an excessive amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts …
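    A minimal Python sketch of the T2 monitoring scheme described above: Phase I data estimate the mean and covariance, and a new observation is flagged against an F-based control limit. The sample size, dimension and alpha level are illustrative assumptions.

```python
# Hotelling's T^2 monitoring sketch: Phase I reference data, Phase II limit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
m, p = 200, 10                      # Phase I sample size, dimension (assumed)
X = rng.normal(size=(m, p))         # in-control reference data

mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)
S_inv = np.linalg.inv(S)

def t2(x):
    """Hotelling's T^2 distance of one observation from the Phase I center."""
    d = x - mu
    return float(d @ S_inv @ d)

# Phase II control limit for a new observation, based on the F distribution
alpha = 0.0027
ucl = (p * (m + 1) * (m - 1) / (m * (m - p))) * stats.f.ppf(1 - alpha, p, m - p)

x_new = rng.normal(size=p) + 0.8    # a shifted (out-of-control) observation
print(f"T2 = {t2(x_new):.2f}, UCL = {ucl:.2f}, signal: {t2(x_new) > ucl}")
```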

  10. Fracture criterion for brittle materials based on statistical cells of finite volume

    International Nuclear Information System (INIS)

    Cords, H.; Kleist, G.; Zimmermann, R.

    1986-06-01

    An analytical consideration of the Weibull statistical analysis of brittle materials established the necessity of including one additional material constant for a more comprehensive description of the failure behaviour. The Weibull analysis is restricted to infinitesimal volume elements as a consequence of the differential calculus applied. It was found that infinitesimally small elements are in conflict with the basic statistical assumption, and that the differential calculus is in fact not needed, since nowadays most stress analyses are based on finite element calculations, which are most suitable for a subsequent statistical analysis of strength. The size of a finite statistical cell has been introduced as the third material parameter. It should represent the minimum volume containing all statistical features of the material, such as the distribution of pores, flaws and grains. The new approach also contains a unique treatment of failure under multiaxial stresses: the quantity responsible for failure under multiaxial stresses is introduced as a modified strain energy. Sixteen different tensile specimens, including CT specimens, have been investigated experimentally and analyzed with the probabilistic fracture criterion. As a result it can be stated that the failure rates of all types of specimens made from three different grades of graphite are predictable, with an accuracy of one standard deviation. (orig.) [de]
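    One plausible weakest-link reading of such a criterion, with the usual Weibull volume integral replaced by a sum over N finite statistical cells, can be written as follows; the notation (sigma_eq, sigma_0, m, V_cell) is assumed here for illustration and is not taken from the report.

```latex
% Failure probability as a weakest-link product over N finite cells of
% volume V_cell; \sigma_{eq,i} is an equivalent stress for cell i derived
% from a modified strain energy, m the Weibull modulus, \sigma_0 a scale.
P_f = 1 - \exp\!\left[-\sum_{i=1}^{N}
      \left(\frac{\sigma_{\mathrm{eq},i}}{\sigma_0}\right)^{m}\right],
\qquad N = \frac{V}{V_{\mathrm{cell}}}
```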

  11. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program at Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group; the actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed an evaluation of present and potential SPC use at KSC, prompted by the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was provided, along with an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation when the need arises.

  12. Frontiers in statistical quality control 11

    CERN Document Server

    Schmid, Wolfgang

    2015-01-01

    The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from the 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia, from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia, and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientific …

  13. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2 to 10^10 m3), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution of rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes. In this case neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value for rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which …

  14. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes. This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d…

  15. Determination of the minimum size of a statistical representative volume element from a fibre-reinforced composite based on point pattern statistics

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

    In a previous study, Trias et al. [1] determined the minimum size of a statistical representative volume element (SRVE) of a unidirectional fibre-reinforced composite, primarily based on numerical analyses of the stress/strain field. In continuation of this, the present study determines the minimum size of an SRVE based on a statistical analysis of the spatial statistics of the fibre packing patterns found in genuine laminates, and those generated numerically using a microstructure generator. © 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  16. Statistical Modeling of Ultrawideband Body-Centric Wireless Channels Considering Room Volume

    Directory of Open Access Journals (Sweden)

    Miyuki Hirose

    2012-01-01

    This paper presents the results of a statistical modeling of on-body ultrawideband (UWB) radio channels for wireless body area network (WBAN) applications. Measurements were conducted in five different rooms. A measured delay profile can be divided into two domains: the first domain (0-4 ns) [...]; the second domain (>4 ns) has multipath components that are dominant and dependent on room volume. The first domain was modeled with a conventional power decay law model, and the second domain with a modified Saleh-Valenzuela model considering the room volume. Realizations of the impulse responses are presented based on the composite model and compared with the measured average power delay profiles.

  17. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical inference.

  18. The statistical mind in modern society. The Netherlands 1850-1940. Volume II: statistics and scientific work

    NARCIS (Netherlands)

    Stamhuis, I.H.; Klep, P.M.M.; Maarseveen, J.G.S.J. van

    2008-01-01

    In the period 1850-1940 statistics developed as a new combination of theory and practice. A wide range of phenomena were looked at in a novel way, and this statistical mindset had a pervasive influence in contemporary society. This development of statistics is closely interlinked with the process of …

  19. New advances in statistical modeling and applications

    CERN Document Server

    Santos, Rui; Oliveira, Maria; Paulino, Carlos

    2014-01-01

    This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.

  20. Comparison of Statistically Modeled Contaminated Soil Volume Estimates and Actual Excavation Volumes at the Maywood FUSRAP Site - 13555

    Energy Technology Data Exchange (ETDEWEB)

    Moore, James [U.S. Army Corps of Engineers - New York District 26 Federal Plaza, New York, New York 10278 (United States); Hays, David [U.S. Army Corps of Engineers - Kansas City District 601 E. 12th Street, Kansas City, Missouri 64106 (United States); Quinn, John; Johnson, Robert; Durham, Lisa [Argonne National Laboratory, Environmental Science Division 9700 S. Cass Ave., Argonne, Illinois 60439 (United States)

    2013-07-01

    As part of the ongoing remediation process at the Maywood Formerly Utilized Sites Remedial Action Program (FUSRAP) properties, Argonne National Laboratory (Argonne) assisted the U.S. Army Corps of Engineers (USACE) New York District by providing contaminated soil volume estimates for the main site area, much of which is fully or partially remediated. As part of the volume estimation process, an initial conceptual site model (ICSM) was prepared for the entire site that captured existing information (with the exception of soil sampling results) pertinent to the possible location of surface and subsurface contamination above cleanup requirements. This ICSM was based on historical anecdotal information, aerial photographs, and the logs from several hundred soil cores that identified the depth of fill material and the depth to bedrock under the site. Specialized geostatistical software developed by Argonne was used to update the ICSM with historical sampling results and down-hole gamma survey information for hundreds of soil core locations. The updating process yielded both a best guess estimate of contamination volumes and a conservative upper bound on the volume estimate that reflected the estimate's uncertainty. Comparison of model results to actual removed soil volumes was conducted on a parcel-by-parcel basis. Where sampling data density was adequate, the actual volume matched the model's average or best guess results. Where contamination was un-characterized and unknown to the model, the actual volume exceeded the model's conservative estimate. Factors affecting volume estimation were identified to assist in planning further excavations. (authors)

  1. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology; On-line Control, including Sampling Inspection and Statistical Process Control; Off-line Control, with Data Analysis and Experimental Design; and fields related to Reliability. Experts with international reputations present their newest contributions.

  2. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some …

  3. Statistical analysis of the sustained lava dome emplacement and destruction processes at Popocatépetl volcano, Central México

    Science.gov (United States)

    Mendoza-Rosas, Ana Teresa; Gómez-Vázquez, Ángel; De la Cruz-Reyna, Servando

    2017-06-01

    Popocatépetl volcano reawakened in 1994 after nearly 70 years of quiescence. Between 1996 and 2015, a succession of at least 38 lava domes was irregularly emplaced and destroyed, with each dome reaching a particular volume at a specific emplacement rate. The complexity of this sequence is analyzed using statistical methods in an attempt to gain insight into the physics and dynamics of the lava dome emplacement and destruction process, and to objectively assess the hazards related to the volcano. The time series of emplacements, dome residences, lava effusion lulls, and emplaced dome volumes and thicknesses are modeled using the simple exponential and Weibull distributions, the compound non-homogeneous generalized Pareto-Poisson process (NHPPP), and the mixture of exponentials distribution (MOED). The statistical analysis reveals that the sequence of dome emplacements is a non-stationary, self-regulating process, most likely controlled by the balance between buoyancy-driven magma ascent and volatile exsolution crystallization. This balance has supported the sustained effusive activity for decades and may persist for an undetermined amount of time. However, the eruptive history of Popocatépetl includes major Plinian phases that may have resulted from a breach in that balance. Certain criteria for recognizing such breaching conditions are inferred from this statistical analysis.
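    In the spirit of the distribution fitting described above, here is a short Python sketch comparing exponential and Weibull fits to a set of repose times via AIC. The synthetic repose times and parameter values are placeholders, not the Popocatépetl catalogue.

```python
# Compare exponential vs Weibull fits to repose (inter-event) times by AIC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
repose = rng.weibull(0.8, size=38) * 180.0   # days; assumed synthetic values

# Fit both models by maximum likelihood (location fixed at 0)
shape, _, scale = stats.weibull_min.fit(repose, floc=0)
mean = repose.mean()                         # exponential MLE scale

ll_weib = np.sum(stats.weibull_min.logpdf(repose, shape, 0, scale))
ll_exp = np.sum(stats.expon.logpdf(repose, 0, mean))

# AIC = 2k - 2*logL; Weibull has 2 free parameters, exponential has 1
print(f"AIC Weibull = {2*2 - 2*ll_weib:.1f}, AIC exponential = {2*1 - 2*ll_exp:.1f}")
```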

  4. Parametric statistical inference for discretely observed diffusion processes

    DEFF Research Database (Denmark)

    Pedersen, Asger Roer

    Part 1: Theoretical results. Part 2: Statistical applications of Gaussian diffusion processes in freshwater ecology.

  5. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  6. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58 …

  7. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference …

  8. Data Mining Foundations and Intelligent Paradigms, Volume 2: Statistical, Bayesian, Time Series and other Theoretical Aspects

    CERN Document Server

    Jain, Lakhmi

    2012-01-01

    Data mining is one of the most rapidly growing research areas in computer science and statistics. In Volume 2 of this three-volume series, we have brought together contributions from some of the most prestigious researchers in theoretical data mining. Each of the chapters is self-contained. Statisticians and applied scientists/engineers will find this volume valuable. Additionally, it provides a sourcebook for graduate students interested in the current direction of research in data mining.

  9. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 2

    National Research Council Canada - National Science Library

    Adamson, Anthony

    1998-01-01

    .... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...

  10. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 1

    National Research Council Canada - National Science Library

    Adamson, Anthony

    1998-01-01

    .... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...

  11. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on the process rather than the results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.

  12. Single photon laser altimeter simulator and statistical signal processing

    Science.gov (United States)

    Vacek, Michael; Prochazka, Ivan

    2013-05-01

    Spaceborne altimeters are common instruments onboard deep-space rendezvous spacecraft. They provide range and topographic measurements critical in spacecraft navigation. Simultaneously, the receiver part may be utilized for an Earth-to-satellite link, one-way time transfer, and precise optical radiometry. The main advantage of the single-photon counting approach is the ability to process signals with a very low signal-to-noise ratio, eliminating the need for large telescopes and a high-power laser source. Extremely small, rugged and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather a sufficient volume of data in repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with an average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission-specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The most promising single-photon altimeter applications are low-orbit (~10 km), low-radial-velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (~10 km), where range evaluation repetition rates of ~100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.

  13. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    The practice of statistical data processing for radiation monitoring is exemplified, and some results are presented. Experience in the practical application of mathematical statistics methods to radiation monitoring data processing allowed the development of a concrete statistical processing algorithm implemented on an M-6000 minicomputer. The suggested algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The algorithm was used to process data observed over a radioactive waste disposal control region. Results of processing surface-water monitoring data are presented.

  14. Using Statistical Process Control to Enhance Student Progression

    Science.gov (United States)

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  15. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin

    2014-01-01

    The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength …

  16. Statistical thermodynamics of nonequilibrium processes

    CERN Document Server

    Keizer, Joel

    1987-01-01

    The structure of the theory of thermodynamics has changed enormously since its inception in the middle of the nineteenth century. Shortly after Thomson and Clausius enunciated their versions of the Second Law, Clausius, Maxwell, and Boltzmann began actively pursuing the molecular basis of thermodynamics, work that culminated in the Boltzmann equation and the theory of transport processes in dilute gases. Much later, Onsager undertook the elucidation of the symmetry of transport coefficients and, thereby, established himself as the father of the theory of nonequilibrium thermodynamics. Combining the statistical ideas of Gibbs and Langevin with the phenomenological transport equations, Onsager and others went on to develop a consistent statistical theory of irreversible processes. The power of that theory is in its ability to relate measurable quantities, such as transport coefficients and thermodynamic derivatives, to the results of experimental measurements. As powerful as that theory is, it is linear and...

  17. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field, providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory, Samuel Kotz and Norman L. Johnson, Editors, 1993. 631 pp. Softcover. ISBN 0-387-94037-5. Breakthroughs in Statistics Volume II: Methodology and Distribution, Samuel Kotz and Norman L. Johnson, Editors …

  18. Statistical process control in wine industry using control cards

    OpenAIRE

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

    This paper is based on research into the technological process of automatic filling of wine bottles at a winery in Stip, Republic of Macedonia. Statistical process control using control charts was carried out. The results and recommendations for improving the process are discussed.

  19. Fundamentals of statistical signal processing

    CERN Document Server

    Kay, Steven M

    1993-01-01

    A unified presentation of parameter estimation for those involved in the design and implementation of statistical signal processing algorithms. Covers important approaches to obtaining an optimal estimator and analyzing its performance, and includes numerous examples as well as applications to real-world problems. MARKETS: For practicing engineers and scientists who design and analyze signal processing systems, i.e., to extract information from noisy signals — radar engineer, sonar engineer, geophysicist, oceanographer, biomedical engineer, communications engineer, economist, statistician, physicist, etc.

  20. Beneficiation-hydroretort processing of US oil shales: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1989-01-01

    This report has been divided into three volumes. Volume I describes the MRI beneficiation work. In addition, Volume I presents the results of joint beneficiation-hydroretorting studies and provides an economic analysis of the combined beneficiation-hydroretorting approach for processing Eastern oil shales. Volume II presents detailed results of hydroretorting tests made by HYCRUDE/IGT on raw and beneficiated oil shales prepared by MRI. Volume III comprises detailed engineering design drawings and supporting data developed by the Roberts and Schaefer Company, Engineers and Contractors, Salt Lake City, Utah, in support of the capital and operating costs for a conceptual beneficiation plant processing an Alabama oil shale.

  1. Statistical Data Processing with R – Metadata Driven Approach

    Directory of Open Access Journals (Sweden)

    Rudi SELJAK

    2016-06-01

    In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions, based on the metadata driven approach. This means that one general program code, which is parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata-driven principle for data validation, the generic software solution and the main issues connected with the use of the statistical software R for this approach.

  2. The statistical mind in modern society. The Netherlands 1850-1940. Volume I: official Statistics, social progress and modern enterprise

    NARCIS (Netherlands)

    Maarseveen, J.G.S.J. van; Klep, P.M.M.; Stamhuis, I.H.

    2008-01-01

    In the period 1850-1940 statistics developed as a new combination of theory and practice. A wide range of phenomena were looked at in a novel way, and this statistical mindset had a pervasive influence in contemporary society. This development of statistics is closely interlinked with the process of …

  3. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Science.gov (United States)

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  4. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  5. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
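    A minimal individuals (X) control chart of the kind SPC builds on, sketched in Python: the moving range estimates the common-cause sigma, and points beyond the 3-sigma limits signal special-cause variation. The data, baseline length and simulated shift are assumptions for illustration.

```python
# Individuals (X) chart with moving-range sigma estimate and 3-sigma limits.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(50.0, 2.0, size=40)  # synthetic process measurements
x[30:] += 6.0                       # simulated special-cause shift

mr = np.abs(np.diff(x))             # moving ranges of consecutive points
sigma_hat = mr.mean() / 1.128       # d2 constant for subgroups of size 2
center = x[:30].mean()              # baseline (pre-shift) period
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

signals = np.where((x > ucl) | (x < lcl))[0]
print(f"CL = {center:.1f}, limits = ({lcl:.1f}, {ucl:.1f}), signals at {signals}")
```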

  6. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on …

  7. Comparative assessment of TRU waste forms and processes. Volume II. Waste form data, process descriptions, and costs

    International Nuclear Information System (INIS)

    Ross, W.A.; Lokken, R.O.; May, R.P.; Roberts, F.P.; Thornhill, R.E.; Timmerman, C.L.; Treat, R.L.; Westsik, J.H. Jr.

    1982-09-01

    This volume contains supporting information for the comparative assessment of the transuranic waste forms and processes summarized in Volume I. Detailed data on the characterization of the waste forms selected for the assessment, process descriptions, and cost information are provided. The purpose of this volume is to provide additional information that may be useful when using the data in Volume I and to provide greater detail on particular waste forms and processes. Volume II is divided into two sections and two appendixes. The first section provides information on the preparation of the waste form specimens used in this study and additional characterization data in support of that in Volume I. The second section includes detailed process descriptions for the eight processes evaluated. Appendix A lists the results of MCC-1 leach test and Appendix B lists additional cost data. 56 figures, 12 tables

  8. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements to the fabrication of MTR-type fuel plates by applying statistical techniques of process control. The work was divided into four steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and applying statistical tools and standard specifications to perform a comparative study of these processes. (author)

  9. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  10. Statistical process control for serially correlated data

    NARCIS (Netherlands)

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice …

  11. Monitoring a PVC batch process with multivariate statistical process control charts

    NARCIS (Netherlands)

    Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.

    1999-01-01

    Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts, different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these …

  12. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for the achievement of six-sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical …
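    A brief Python sketch of the capability arithmetic described above: Cp and Cpk from specification limits, and an approximate sigma level from the expected out-of-specification fraction (with the conventional 1.5-sigma long-term shift). The specification limits and sample data are assumed for illustration.

```python
# Process capability (Cp, Cpk) and approximate sigma level from spec limits.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
weights = rng.normal(250.0, 2.5, size=100)   # tablet weight, mg (assumed)
lsl, usl = 242.5, 257.5                      # spec limits (assumed)

mu, sigma = weights.mean(), weights.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

# Expected fraction nonconforming under a normal model, then the z-value
# it corresponds to, plus the conventional 1.5-sigma shift.
p_out = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)
sigma_level = stats.norm.isf(p_out) + 1.5
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level ~ {sigma_level:.2f}")
```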

  13. Statistical Methods for Stochastic Differential Equations

    CERN Document Server

    Kessler, Mathieu; Sorensen, Michael

    2012-01-01

    The seventh volume in the SemStat series, Statistical Methods for Stochastic Differential Equations presents current research trends and recent developments in statistical methods for stochastic differential equations. Written to be accessible to both new students and seasoned researchers, each self-contained chapter starts with introductions to the topic at hand and builds gradually towards discussing recent research. The book covers Wiener-driven equations as well as stochastic differential equations with jumps, including continuous-time ARMA processes and COGARCH processes. It presents a sp…

  14. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * hazard function * Cox-model Subject RIV: BB - Applied Statistics, Operational Research

  15. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  16. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
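    The reduced-variance criterion mentioned above is commonly expressed through the Fano factor F(T) = Var[N(T)] / E[N(T)], which equals 1 for a homogeneous Poisson process. Below is a small Python sketch of the criterion, with assumed rates and a gamma-mixed (doubly stochastic) alternative for contrast; it illustrates the diagnostic, not the paper's fractional processes themselves.

```python
# Fano factor F(T) = Var[N(T)] / E[N(T)]: 1 for Poisson counting,
# != 1 for non-Poissonian (e.g. sub-Poissonian or clustered) processes.
import numpy as np

rng = np.random.default_rng(8)
T, trials, rate = 100.0, 2000, 1.0          # assumed example values

counts = rng.poisson(rate * T, size=trials)  # homogeneous Poisson reference
fano_poisson = counts.var() / counts.mean()

# Doubly stochastic alternative: the rate fluctuates (gamma) per trial
counts_mixed = rng.poisson(rng.gamma(4.0, rate * T / 4.0, size=trials))
fano_mixed = counts_mixed.var() / counts_mixed.mean()

print(f"Fano Poisson ~ {fano_poisson:.2f}, Fano mixed ~ {fano_mixed:.2f} (> 1)")
```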

  17. Statistical inference for Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus Plenge

    2002-01-01

    Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space …

  18. Multivariate Statistical Process Control Charts: An Overview

    OpenAIRE

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p…

  19. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process, as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric …

  20. Statistical estimation of process holdup

    International Nuclear Information System (INIS)

    Harris, S.P.

    1988-01-01

    Estimates of potential process holdup and their random and systematic error variances are derived to improve the inventory difference (ID) estimate and its associated measure of uncertainty for a new process at the Savannah River Plant. Since the process is in a start-up phase, data have not yet accumulated for statistical modelling. The material produced in the facility will be a very pure, highly enriched 235U with very small isotopic variability. Therefore, data published in LANL's unclassified report on Estimation Methods for Process Holdup of Special Nuclear Materials were used as a starting point for the modelling process. LANL's data were gathered through a series of designed measurements of special nuclear material (SNM) holdup at two of their materials-processing facilities. They had also taken steps to improve the quality of the data through controlled, larger-scale experiments outside of LANL at highly enriched uranium processing facilities. The data they have accumulated are on an equipment-component basis. Our modelling has been restricted to the wet chemistry area. We have developed predictive models for each of our process components based on the LANL data. 43 figs

  1. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model

    National Research Council Canada - National Science Library

    Adamson, Anthony

    1998-01-01

    .... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...

  2. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  3. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Science.gov (United States)

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs), cervical spine (C-spine) and mandible, were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, the effective displacements of the ten registrations with the lowest similarity metrics were compared with the three-degree-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively …
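    A minimal Python sketch of the similarity metric at the heart of this framework: normalized cross-correlation between a reference sub-volume and a daily image inside a registration tracking volume. Random arrays stand in for the CT/CBCT data; shapes and noise levels are assumptions.

```python
# Normalized cross-correlation (NCC) between two image sub-volumes.
import numpy as np

def ncc(a, b):
    """NCC of two equally shaped arrays: 1.0 for identical images."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(6)
ref = rng.normal(size=(32, 32, 32))                 # reference RTV (e.g. C-spine)
good = ref + 0.1 * rng.normal(size=ref.shape)        # well-registered daily image
bad = np.roll(ref, 5, axis=0) + 0.1 * rng.normal(size=ref.shape)  # ~5-voxel misregistration

print(f"NCC well-registered: {ncc(ref, good):.3f}")
print(f"NCC misregistered:  {ncc(ref, bad):.3f}")
# Track NCC per fraction on a control chart; values below a patient-specific
# limit flag registrations for review.
```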

  4. Preliminary evaluation of alternative waste form solidification processes. Volume II. Evaluation of the processes

    International Nuclear Information System (INIS)

    1980-08-01

    This Volume II presents engineering feasibility evaluations of the eleven processes for solidification of nuclear high-level liquid wastes (HLLW) described in Volume I of this report. Each evaluation was based on a systematic assessment of the process with respect to six principal evaluation criteria: complexity of process; state of development; safety; process requirements; development work required; and facility requirements. The principal criteria were further subdivided into a total of 22 subcriteria, each of which was assigned a weight. Each process was then assigned a figure of merit, on a scale of 1 to 10, for each of the subcriteria. A total rating was obtained for each process by summing the products of the subcriteria ratings and the subcriteria weights. The evaluations were based on the process descriptions presented in Volume I of this report, supplemented by information obtained from the literature, including publications by the originators of the various processes. Waste form properties were, in general, not evaluated. This document describes the approach which was taken, the development and application of the rating criteria and subcriteria, and the evaluation results. A series of appendices sets forth summary descriptions of the processes and the ratings, together with the complete numerical ratings assigned; two appendices present further technical details on the rating process.

  5. Quantitative analysis of CT brain images: a statistical model incorporating partial volume and beam hardening effects

    International Nuclear Information System (INIS)

    McLoughlin, R.F.; Ryan, M.V.; Heuston, P.M.; McCoy, C.T.; Masterson, J.B.

    1992-01-01

    The purpose of this study was to construct and evaluate a statistical model for the quantitative analysis of computed tomographic brain images. Data were derived from standard sections in 34 normal studies. A model representing the intracranial pure tissue and partial volume areas, with allowance for beam hardening, was developed. The average percentage error in estimation of areas, derived from phantom tests using the model, was 28.47%. We conclude that our model is not sufficiently accurate to be of clinical use, even though allowance was made for partial volume and beam hardening effects. (author)

  6. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  7. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  8. Measurable inhomogeneities in stock trading volume flow

    Science.gov (United States)

    Cortines, A. A. G.; Riera, R.; Anteneodo, C.

    2008-08-01

    We investigate the statistics of volumes of shares traded in stock markets. We show that the stochastic process of trading volumes can be understood on the basis of a mixed Poisson process at the microscopic time level. The beta distribution of the second kind (also known as q-gamma distribution), that has been proposed to describe empirical volume histograms, naturally results from our analysis. In particular, the shape of the distribution at small volumes is governed by the degree of granularity in the trading process, while the exponent controlling the tail is a measure of the inhomogeneities in market activity. Furthermore, the present case furnishes empirical evidence of how power law probability distributions can arise as a consequence of a fluctuating intrinsic parameter.
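
    A minimal simulation of the mixed-Poisson idea: letting the trading intensity itself fluctuate fattens the tail of the volume distribution relative to a homogeneous Poisson benchmark. The gamma mixing law here is an illustrative assumption; the paper derives the beta distribution of the second kind.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Homogeneous benchmark: fixed trading intensity.
    fixed = rng.poisson(lam=10.0, size=n)

    # Mixed Poisson: the intensity itself fluctuates (gamma-distributed here),
    # mimicking inhomogeneous market activity; the marginal count law is then
    # heavy-tailed (negative binomial for this particular mixing choice).
    lam = rng.gamma(shape=2.0, scale=5.0, size=n)   # mean intensity still 10
    mixed = rng.poisson(lam=lam)

    print(fixed.max(), mixed.max())   # mixed shows far larger extremes
    print(fixed.std(), mixed.std())   # and larger dispersion
    ```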

  9. Statistical model for grain boundary and grain volume oxidation kinetics in UO2 spent fuel

    International Nuclear Information System (INIS)

    Stout, R.B.; Shaw, H.F.; Einziger, R.E.

    1989-09-01

    This paper addresses statistical characteristics for the simplest case of grain boundary/grain volume oxidation kinetics of UO2 to U3O7 for a fragment of a spent fuel pellet. It also presents a limited discussion of future extensions to this simple case to represent the more complex cases of oxidation kinetics in spent fuels. 17 refs., 1 fig

  10. Memory-type control charts in statistical process control

    NARCIS (Netherlands)

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis, plotted against time on the horizontal axis. The graph is completed with control limits that mark the bounds of expected common-cause variation. Once…
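
    A sketch of one memory-type chart, the EWMA, which weights the whole history of observations rather than only the latest sample, so small sustained shifts accumulate into a signal. The smoothing constant, limit width, and data are illustrative assumptions.

    ```python
    import numpy as np

    def ewma_chart(x, lam=0.2, L=3.0):
        # Memory-type (EWMA) control chart: z[t] is an exponentially weighted
        # average of all observations up to time t.
        x = np.asarray(x, dtype=float)
        mu0, sigma = x.mean(), x.std(ddof=1)   # in practice, from a Phase I sample
        z = np.empty_like(x)
        z[0] = lam * x[0] + (1 - lam) * mu0
        for t in range(1, len(x)):
            z[t] = lam * x[t] + (1 - lam) * z[t - 1]
        k = np.arange(1, len(x) + 1)
        width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
        return z, mu0 + width, mu0 - width

    z, ucl, lcl = ewma_chart([10.1, 9.8, 10.0, 10.3, 10.5, 10.6, 10.8])
    print(np.where((z > ucl) | (z < lcl))[0])  # indices signalling out-of-control
    ```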

  11. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  12. Statistical processing of technological and radiochemical data

    International Nuclear Information System (INIS)

    Lahodova, Zdena; Vonkova, Kateřina

    2011-01-01

    The project described in this article had two goals. The main goal was to compare technological and radiochemical data from two units of nuclear power plant. The other goal was to check the collection, organization and interpretation of routinely measured data. Monitoring of analytical and radiochemical data is a very valuable source of knowledge for some processes in the primary circuit. Exploratory analysis of one-dimensional data was performed to estimate location and variability and to find extreme values, data trends, distribution, autocorrelation etc. This process allowed for the cleaning and completion of raw data. Then multiple analyses such as multiple comparisons, multiple correlation, variance analysis, and so on were performed. Measured data was organized into a data matrix. The results and graphs such as Box plots, Mahalanobis distance, Biplot, Correlation, and Trend graphs are presented in this article as statistical analysis tools. Tables of data were replaced with graphs because graphs condense large amounts of information into easy-to-understand formats. The significant conclusion of this work is that the collection and comprehension of data is a very substantial part of statistical processing. With well-prepared and well-understood data, its accurate evaluation is possible. Cooperation between the technicians who collect data and the statistician who processes it is also very important. (author)

  13. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  14. Statistical convergence of a non-positive approximation process

    International Nuclear Information System (INIS)

    Agratini, Octavian

    2011-01-01

    Highlights: → A general class of approximation processes is introduced. → The A-statistical convergence is studied. → Applications in quantum calculus are delivered. - Abstract: Starting from a general sequence of linear and positive operators of discrete type, we associate its r-th order generalization. This construction involves high-order derivatives of a signal and it loses the positivity property. Considering that the initial approximation process is A-statistically uniformly convergent, we prove that the property is inherited by the new sequence. Also, our result includes information about the uniform convergence. Two applications in q-calculus are presented. We study q-analogues of both the Meyer-König and Zeller operators and the Stancu operators.

  15. 12th Workshop on Stochastic Models, Statistics and Their Applications

    CERN Document Server

    Rafajłowicz, Ewaryst; Szajowski, Krzysztof

    2015-01-01

    This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.

  16. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods take a prominent place in psychologists' educational programs. Known as difficult to understand and hard to learn, these contents are feared by students, and those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only as a way of commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To this end, we analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write and critically read original articles, the backbone of research, presupposes a high degree of statistical education. To ignore statistics is to ignore research, and ultimately to abandon one's own professional work to arbitrariness.

  17. Statistical representative elementary volumes of porous media determined using greyscale analysis of 3D tomograms

    Science.gov (United States)

    Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

    2017-09-01

    Digital rock physics carries the dogmatic concept of having to segment volume images for quantitative analysis but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area and diffusive tortuosity, by using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel level porosity estimate using a Gaussian mixture model. A simple model assumption was made that allowed formulating a two point correlation function for surface area estimates using Bayes' theory. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed in binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e. resolution dependence and intersample and intrasample variance. Although we cannot make any claims on the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
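
    A toy version of the greyscale idea: fit a two-component Gaussian mixture to grey values and read each voxel's posterior pore probability as a porosity estimate, instead of a hard segmentation label. The synthetic grey values and the two-component choice are assumptions; the paper's actual transform is more involved.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Illustrative grey values: a two-phase material with partial-volume mixing.
    rng = np.random.default_rng(1)
    grey = np.concatenate([rng.normal(60, 10, 5000),    # pore-like voxels
                           rng.normal(140, 12, 5000)])  # solid-like voxels

    gmm = GaussianMixture(n_components=2, random_state=0).fit(grey.reshape(-1, 1))
    pore = int(np.argmin(gmm.means_))          # darker component taken as pore

    # Voxel-level porosity estimate = posterior probability of the pore
    # component, rather than a hard 0/1 segmentation label.
    phi = gmm.predict_proba(grey.reshape(-1, 1))[:, pore]
    print(phi.mean())   # greyscale porosity estimate for the whole (sub)volume
    ```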

  18. Towards the elimination of Monte Carlo statistical fluctuation from dose volume histograms for radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Sempau, J.; Bielajew, A.F.

    2000-01-01

    The Monte Carlo calculation of dose for radiotherapy treatment planning purposes introduces unavoidable statistical noise into the prediction of dose in a given volume element (voxel). When the doses in these voxels are summed to produce dose volume histograms (DVHs), this noise translates into a broadening of differential DVHs and correspondingly flatter DVHs. A brute force approach would entail calculating dose for long periods of time - enough to ensure that the DVHs had converged. In this paper we introduce an approach for deconvolving the statistical noise from DVHs, thereby obtaining estimates for converged DVHs obtained about 100 times faster than the brute force approach described above. There are two important implications of this work: (a) decisions based upon DVHs may be made much more economically using the new approach and (b) inverse treatment planning or optimization methods may employ Monte Carlo dose calculations at all stages of the iterative procedure since the prohibitive cost of Monte Carlo calculations at the intermediate calculation steps can be practically eliminated. (author)

  19. Statistical analysis and digital processing of the Mössbauer spectra

    International Nuclear Information System (INIS)

    Prochazka, Roman; Tucek, Jiri; Mashlan, Miroslav; Pechousek, Jiri; Tucek, Pavel; Marek, Jaroslav

    2010-01-01

    This work focuses on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a purely statistical approach to the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of non-resonant photon counting, electronic noise (from γ-ray detection and discrimination units), and the quality of the velocity system, which can be characterized by velocity nonlinearities. The possibility of a noise-reducing process using a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed on many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.

  1. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X̄ (mean), R (range), X (individual observations), and MR (moving…

  2. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  3. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of the ISO 9000 series of norms) for various processes, products and services belong among the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The contribution presents the theoretical basis and applicability of the principles of: cause-and-effect diagnostics, Pareto analysis and the Lorenz curve, number distributions and frequency curves of random variable distributions, and Shewhart control charts.

  4. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
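
    The 'p-charts' mentioned above have a standard construction; a minimal sketch with invented monthly counts follows, where the limits are the usual 3-sigma bounds on a proportion.

    ```python
    import numpy as np

    def p_chart_limits(events, cases):
        # 3-sigma limits for a p-chart of adverse-event proportions, where
        # events[i]/cases[i] is the observed rate in period i.
        events, cases = np.asarray(events), np.asarray(cases)
        pbar = events.sum() / cases.sum()            # centre line
        se = np.sqrt(pbar * (1 - pbar) / cases)      # per-period standard error
        return pbar, pbar + 3 * se, np.clip(pbar - 3 * se, 0, None)

    events = [12, 9, 15, 30, 11]        # e.g. difficult emergences per month
    cases  = [800, 750, 820, 790, 810]  # anesthetics given per month
    pbar, ucl, lcl = p_chart_limits(events, cases)
    rates = np.array(events) / np.array(cases)
    print(np.where((rates > ucl) | (rates < lcl))[0])  # months to investigate
    ```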

  5. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Science.gov (United States)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort where the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  6. Some properties of point processes in statistical optics

    International Nuclear Information System (INIS)

    Picinbono, B.; Bendjaballah, C.

    2010-01-01

    The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.

  7. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  8. Statistical process control for alpha spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why a problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventive maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.
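
    For instrument parameters tracked one measurement at a time, the individual X-chart mentioned above is conventionally built from the average moving range; a minimal sketch with invented daily background count rates follows (including the spike in the limit estimate is a simplification).

    ```python
    import numpy as np

    def individuals_chart(x):
        # Individual X-chart: limits from the average moving range, using the
        # standard bias constant d2 = 1.128 for a moving range of two points.
        x = np.asarray(x, dtype=float)
        mr = np.abs(np.diff(x)).mean()
        centre = x.mean()
        return centre, centre + 3 * mr / 1.128, centre - 3 * mr / 1.128

    # Illustrative daily background count rates for one alpha detector.
    bg = [0.8, 1.1, 0.9, 1.0, 1.2, 0.7, 3.5]
    centre, ucl, lcl = individuals_chart(bg)
    print([i for i, v in enumerate(bg) if v > ucl or v < lcl])  # flags the spike at index 6
    ```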

  10. Distinct contributions of attention and working memory to visual statistical learning and ensemble processing.

    Science.gov (United States)

    Hall, Michelle G; Mattingley, Jason B; Dux, Paul E

    2015-08-01

    The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved.

  11. Limiting processes in non-equilibrium classical statistical mechanics

    International Nuclear Information System (INIS)

    Jancel, R.

    1983-01-01

    After recalling the basic principles of statistical mechanics, the results of ergodic theory, the transient behaviour at the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are stated. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, the BBGKY hierarchy, limit theorems [fr]

  12. Self-assessed performance improves statistical fusion of image labels

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2014-03-15

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance

  14. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    Science.gov (United States)

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik…

  15. Statistical properties of antisymmetrized molecular dynamics for non-nucleon-emission and nucleon-emission processes

    International Nuclear Information System (INIS)

    Ono, A.; Horiuchi, H.

    1996-01-01

    Statistical properties of antisymmetrized molecular dynamics (AMD) are classical in the case of nucleon-emission processes, while they are quantum mechanical for the processes without nucleon emission. In order to understand this situation, we first clarify that two mutually opposite statistics coexist in the AMD framework: one is the classical statistics of the motion of wave packet centroids and the other is the quantum statistics of the motion of wave packets which is described by the AMD wave function. We prove the classical statistics of wave packet centroids by using the framework of the microcanonical ensemble of the nuclear system with a realistic effective two-nucleon interaction. We show that the relation between the classical statistics of wave packet centroids and the quantum statistics of wave packets can be obtained by taking into account the effects of the wave packet spread. This relation clarifies how the quantum statistics of wave packets emerges from the classical statistics of wave packet centroids. It is emphasized that the temperature of the classical statistics of wave packet centroids is different from the temperature of the quantum statistics of wave packets. We then explain that the statistical properties of AMD for nucleon-emission processes are classical because nucleon-emission processes in AMD are described by the motion of wave packet centroids. We further show that when we improve the description of the nucleon-emission process so as to take into account the momentum fluctuation due to the wave packet spread, the AMD statistical properties for nucleon-emission processes change drastically into quantum statistics. Our study of nucleon-emission processes can be conversely regarded as giving another kind of proof of the fact that the statistics of wave packets is quantum mechanical while that of wave packet centroids is classical. copyright 1996 The American Physical Society

  16. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  17. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  18. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  19. Extension of the direct statistical approach to a volume parameter model (non-integer splitting)

    International Nuclear Information System (INIS)

    Burn, K.W.

    1990-01-01

    The Direct Statistical Approach is a rigorous mathematical derivation of the second moment for surface splitting and Russian Roulette games attached to the Monte Carlo modelling of fixed-source particle transport. It has been extended to a volume parameter model (involving non-integer 'expected value' splitting), and then to a cell model. The cell model gives second moment and time functions that have a closed form. This suggests the possibility of two different methods of solution of the optimum splitting/Russian Roulette parameters. (author)
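
    Non-integer 'expected value' splitting is conventionally implemented by probabilistic rounding so that the expected particle weight is preserved; the sketch below shows that generic textbook scheme, not necessarily the exact formulation used in the Direct Statistical Approach.

    ```python
    import random

    def expected_value_split(weight, importance_ratio):
        # On entering a region importance_ratio times more important, split a
        # particle into n copies with E[n] = importance_ratio, so the total
        # expected weight is conserved.
        n = int(importance_ratio)
        if random.random() < importance_ratio - n:
            n += 1                          # probabilistic round-up
        return [weight / importance_ratio] * n

    random.seed(0)
    copies = expected_value_split(weight=1.0, importance_ratio=2.75)
    print(len(copies), copies)   # 2 or 3 copies, each of weight 1/2.75
    ```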

  20. Industrial experience feedback of a geostatistical estimation of contaminated soil volumes - 59181

    International Nuclear Information System (INIS)

    Faucheux, Claire; Jeannee, Nicolas

    2012-01-01

    Geostatistics is attracting growing interest for remediation forecasting at potentially contaminated sites: it provides suitable methods for mapping both chemical and radiological pollution, for estimating contaminated volumes, potentially integrating auxiliary information, and for setting up adaptive sampling strategies. As part of demonstration studies carried out for GeoSiPol (Geostatistics for Polluted Sites), geostatistics has been applied to the detailed diagnosis of a former oil depot in France. The ability within the geostatistical framework to generate pessimistic/probable/optimistic scenarios for the contaminated volumes allows a quantification of the risks associated with the remediation process: e.g. the financial risk of excavating clean soils, or the sanitary risk of leaving contaminated soils in place. After a first mapping, an iterative approach leads to collecting additional samples in areas previously identified as highly uncertain. Estimated volumes are then updated and compared to the volumes actually excavated. This benchmarking therefore provides practical feedback on the performance of the geostatistical methodology. (authors)

  1. The extraction and integration framework: a two-process account of statistical learning.

    Science.gov (United States)

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved

  2. Statistical mixing and aggregation in Feller diffusion

    International Nuclear Information System (INIS)

    Anteneodo, C; Duarte Queirós, S M

    2009-01-01

    We consider Feller mean-reverting square-root diffusion, which has been applied to model a wide variety of processes with linearly state-dependent diffusion, such as stochastic volatility and interest rates in finance, and neuronal and population dynamics in the natural sciences. We focus on the statistical mixing (or superstatistical) process in which the parameter related to the mean value can fluctuate—a plausible mechanism for the emergence of heavy-tailed distributions. We obtain analytical results for the associated probability density function (both stationary and time-dependent), its correlation structure and aggregation properties. Our results are applied to explain the statistics of stock traded volume at different aggregation scales
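
    A minimal simulation of the mixing mechanism: Euler-Maruyama paths of the Feller square-root diffusion whose mean-reversion level mu is itself drawn at random across realizations, which fattens the tails of the pooled marginal law. All parameter values are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def feller_path(x0, theta, mu, sigma, dt=1e-3, n=2000):
        # Euler-Maruyama discretization of the Feller square-root diffusion
        # dX = theta*(mu - X) dt + sigma*sqrt(max(X, 0)) dW.
        x = np.empty(n)
        x[0] = x0
        for t in range(1, n):
            xp = max(x[t - 1], 0.0)
            x[t] = (x[t - 1] + theta * (mu - xp) * dt
                    + sigma * np.sqrt(xp * dt) * rng.standard_normal())
        return x

    # Statistical mixing (superstatistics): mu fluctuates across realizations.
    mus = rng.gamma(shape=2.0, scale=1.0, size=50)
    pooled = np.concatenate([feller_path(1.0, 2.0, m, 0.5)[1000:] for m in mus])
    print(pooled.mean(), pooled.std())
    ```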

  3. Statistical Process Control in a Modern Production Environment

    DEFF Research Database (Denmark)

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered there and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows building diagnostic plots based on the parameter estimates that can provide valuable insight…
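
    A bare-bones version of the Paper 2 idea under a normality assumption (the thesis allows much richer models): estimate from a sliding window the probability that the next item falls outside the specification limits, and stop when it exceeds a threshold. Function names, limits, and data are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    def prob_next_out_of_spec(window, lsl, usl):
        # P(next item outside [lsl, usl]) under a fitted normal model,
        # a deliberate simplification of the thesis's general approach.
        mu, s = np.mean(window), np.std(window, ddof=1)
        p_in = stats.norm.cdf(usl, mu, s) - stats.norm.cdf(lsl, mu, s)
        return 1.0 - p_in

    window = [10.02, 9.97, 10.05, 10.11, 10.16, 10.20]  # most recent observations
    p = prob_next_out_of_spec(window, lsl=9.8, usl=10.2)
    if p > 0.05:                                        # pre-determined threshold
        print(f"stop the process: estimated nonconforming probability {p:.3f}")
    ```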

  4. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon

    2014-03-01

    High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comite Euro-international du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better to predict the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of fly ash substitution rate was higher than that of w/b ratio initially but reduced over time. © 2014 Elsevier Ltd. All rights reserved.

  5. Application of statistical process control to qualitative molecular diagnostic assays.

    Directory of Open Access Journals (Sweden)

    Cathal P O'brien

    2014-11-01

    Modern pathology laboratories, and in particular high throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to such assays is an obvious disadvantage, this study aimed to solve the problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require larger numbers of samples, with a resultant protracted time to detection. Modelled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
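
    A minimal sketch of the frequency-estimate-plus-confidence-interval approach, here using a Wilson score interval; the paper does not specify which interval it employs, and the expected frequency and counts below are invented.

    ```python
    import math

    def wilson_interval(k, n, z=1.96):
        # 95% Wilson score interval for an observed mutation frequency k/n.
        p = k / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    expected = 0.40        # historically expected mutation frequency for the assay
    k, n = 25, 100         # mutations detected in the current run of samples
    lo, hi = wilson_interval(k, n)
    if not (lo <= expected <= hi):
        print(f"observed frequency {k/n:.2f} deviates from expected {expected:.2f}")
    ```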

  6. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  7. Statistical process control support during Defense Waste Processing Facility chemical runs

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically-defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and flyash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous or PHA) is blended with the insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository

  8. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    … Statistical analysis is the mathematical science … quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques are mainly used for alarm … Balakrishnan N, editors. Handbook of Statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555–602 (order statistics and their applications) …

  9. Statistical Analysis of CMC Constituent and Processing Data

    Science.gov (United States)

    Fornuff, Jonathan

    2004-01-01

    Ceramic Matrix Composites (CMCs) are the next "big thing" in high-temperature structural materials. In the case of jet engines, it is widely believed that the metallic superalloys currently utilized for hot structures (combustors, shrouds, turbine vanes and blades) are nearing their potential limits of improvement. In order to allow increased turbine temperatures for greater engine efficiency, materials scientists have begun looking toward advanced CMCs, and SiC/SiC composites in particular. Ceramic composites provide greater strength-to-weight ratios at higher temperatures than metallic alloys, but at the same time pose greater challenges in micro-structural optimization, which in turn increases the cost of the material as well as the risk of variability in the material's thermo-structural behavior. The goal of this study is to model various potential CMC engine materials, examine the current variability in their properties due to variability in component processing conditions and constituent materials, and then see how processing and constituent variations affect key strength, stiffness, and thermal properties of the finished components. Essentially, this means trying to model variations in a component's behavior from knowledge of what went into creating it. Composites with an inter-phase, manufactured by chemical vapor infiltration (CVI) and melt infiltration (MI), were considered. Examinations of (1) the percent constituents by volume, (2) the inter-phase thickness, (3) variations in the total porosity, and (4) variations in the chemical composition of the SiC fiber are carried out and modeled using various codes used at NASA-Glenn (PCGina, NASALife, CEMCAN, etc.). The effects of these variations and the ranking of their respective influences on the various thermo-mechanical material properties are studied and compared to available test data. Changes to the material properties, as well as minor changes to geometry, are then made to the computer model and the detrimental effects…

  10. Paper Quality Control Using Statistical Process Control in Paper Machine 3

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2017-01-01

    Full Text Available The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method.  SPC is a problem-solving technique used to monitor, control, analyze, manage and improve products and processes using statistical methods.  Based on Pareto diagrams, the wavy defect is found to be the most frequent, accounting for 81.7% of defects.  The human factor, meanwhile, is found to be the main cause of defects, primarily due to a lack of understanding of the machinery and a lack of training, both leading to errors in data input.
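
    A Pareto analysis of the kind used above takes only a few lines of plotting code. The sketch below assumes matplotlib; the defect categories and counts are hypothetical stand-ins, with only the 81.7% wavy-defect share echoing the abstract.

        # Illustrative Pareto chart of paper-defect counts (hypothetical data).
        import matplotlib.pyplot as plt

        defects = {"wavy": 817, "holes": 95, "streaks": 52, "dirt": 36}
        labels = sorted(defects, key=defects.get, reverse=True)
        counts = [defects[l] for l in labels]
        total = sum(counts)
        cum_pct = [100 * sum(counts[:i + 1]) / total for i in range(len(counts))]

        fig, ax = plt.subplots()
        ax.bar(labels, counts)                      # defect frequencies
        ax2 = ax.twinx()
        ax2.plot(labels, cum_pct, "o-", color="k")  # cumulative percentage line
        ax2.set_ylim(0, 110)
        ax.set_ylabel("count")
        ax2.set_ylabel("cumulative %")
        plt.show()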

  11. Large-Deviation Results for Discriminant Statistics of Gaussian Locally Stationary Processes

    Directory of Open Access Journals (Sweden)

    Junichi Hirukawa

    2012-01-01

    Full Text Available This paper discusses the large-deviation principle of discriminant statistics for Gaussian locally stationary processes. First, large-deviation theorems for quadratic forms and the log-likelihood ratio for a Gaussian locally stationary process with a mean function are proved; their asymptotics are described by large-deviation rate functions. Second, we consider situations where the processes are misspecified as stationary. In these misspecified cases, we formally construct the log-likelihood ratio discriminant statistics and derive large-deviation theorems for them. Since these are complicated, they are evaluated and illustrated by numerical examples. We find that misspecifying the process as stationary seriously affects discrimination.

  12. The application of bayesian statistic in data fit processing

    International Nuclear Information System (INIS)

    Guan Xingyin; Li Zhenfu; Song Zhaohui

    2010-01-01

    The rationale and drawbacks of the least-squares fitting usually used in data processing are analyzed, and the theory and common methods for applying Bayesian statistics to data processing are presented in detail. As the analysis shows, the Bayesian approach avoids the restrictive hypotheses that least-squares fitting requires, and its results are more scientific and more easily understood; it may therefore replace least-squares fitting in data processing. (authors)
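
    As a minimal sketch of the contrast drawn above, the code below fits a single constant to repeated measurements by least squares (the sample mean) and by a conjugate Bayesian update with a normal prior. The prior, noise level and data are illustrative assumptions, not taken from the paper.

        import numpy as np

        data = np.array([10.2, 9.8, 10.5, 10.1])   # repeated measurements
        sigma = 0.3                                 # known measurement std. dev.

        # A least-squares fit of a constant is simply the sample mean.
        ls_estimate = data.mean()

        # Bayesian update of a conjugate normal prior N(mu0, tau0^2).
        mu0, tau0 = 9.0, 1.0
        prec = 1 / tau0**2 + len(data) / sigma**2   # posterior precision
        post_mean = (mu0 / tau0**2 + data.sum() / sigma**2) / prec
        post_sd = prec ** -0.5

        print(f"least squares: {ls_estimate:.3f}")
        print(f"posterior:     {post_mean:.3f} +/- {post_sd:.3f}")

    The posterior mean shrinks toward the prior when data are scarce and approaches the least-squares value as measurements accumulate, which is the behaviour the abstract appeals to.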

  13. Parametric analysis of the statistical model of the stick-slip process

    Science.gov (United States)

    Lima, Roberta; Sampaio, Rubens

    2017-06-01

    This paper performs a parametric analysis of the statistical model of the response of a dry-friction oscillator. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-friction force modeled as Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion, whose velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces stochastic stick-slip oscillations, and the system response is composed of a random sequence alternating stick and slip modes. From realizations of the system, a statistical model is constructed for this sequence. In this statistical model, the variables of interest are modeled as random variables: for example, the number of time intervals in which stick or slip occurs, the instants at which they begin, and their durations. Samples of the system response are computed by integrating the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated from the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to make a parametric analysis of the statistical model of the stick-slip process.
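
    The excitation described above can be sampled directly: a bang-bang base velocity that switches sign at the event times of a Poisson process. The sketch below is only an illustration of that sampling step; the rate, amplitude and horizon are assumptions, and the oscillator integration itself is not shown.

        import numpy as np

        rng = np.random.default_rng(0)
        rate, t_end = 2.0, 10.0     # switching rate [1/s] and horizon [s], assumed

        # Event times of a homogeneous Poisson process: cumulative exponential gaps.
        gaps = rng.exponential(1.0 / rate, size=100)
        switch_times = np.cumsum(gaps)
        switch_times = switch_times[switch_times < t_end]

        def base_velocity(t, v0=1.0):
            """Bang-bang base velocity: +/- v0, changing sign at each event time."""
            return v0 * (-1.0) ** np.searchsorted(switch_times, t)

        # Integrating the oscillator against many independent realizations of this
        # signal yields the samples from which the stick-slip statistics (number of
        # stick/slip intervals, onsets, durations) are estimated.
        print(base_velocity(np.linspace(0, t_end, 5)))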

  14. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  15. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    Science.gov (United States)

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T² statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing the process disturbances. Although OPLS advantages have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
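
    Once latent-variable scores are available (from PLS or OPLS), the Hotelling's T² statistic and its control limit reduce to a few lines. In the sketch below the nominal-batch scores are simply simulated, and the F-based limit is the standard one for a new observation; all numbers are illustrative.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        scores = rng.normal(size=(30, 2))   # latent-variable scores, 30 nominal batches
        n, a = scores.shape

        cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))
        # 95% limit for the T^2 of a new observation (F-distribution based)
        limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.95, a, n - a)

        new = np.array([2.5, -1.8])          # scores of a newly monitored batch
        d = new - scores.mean(axis=0)
        t2 = d @ cov_inv @ d
        print(f"T2 = {t2:.2f}, limit = {limit:.2f}, fault: {t2 > limit}")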

  16. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  17. Molecular representation of molar domain (volume), evolution equations, and linear constitutive relations for volume transport.

    Science.gov (United States)

    Eu, Byung Chan

    2008-09-07

    In the traditional theories of irreversible thermodynamics and fluid mechanics, the specific volume and molar volume have been interchangeably used for pure fluids, but in this work we show that they should be distinguished from each other and given distinctive statistical mechanical representations. In this paper, we present a general formula for the statistical mechanical representation of molecular domain (volume or space) by using the Voronoi volume and its mean value, which may be regarded as molar domain (volume), and also the statistical mechanical representation of volume flux. By using their statistical mechanical formulas, the evolution equations of volume transport are derived from the generalized Boltzmann equation of fluids. Approximate solutions of the evolution equations of volume transport provide kinetic theory formulas for the molecular domain, the constitutive equations for molar domain (volume) and volume flux, and the dissipation of energy associated with volume transport. Together with the constitutive equation for the mean velocity of the fluid obtained in a previous paper, the evolution equations for volume transport not only shed fresh light on, and insight into, irreversible phenomena in fluids but also can be applied to study fluid flow problems in a manner hitherto unavailable in fluid dynamics and irreversible thermodynamics. Their roles in the generalized hydrodynamics will be considered in the sequel.

  18. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  19. Statistical shape modeling based renal volume measurement using tracked ultrasound

    Science.gov (United States)

    Pai Raikar, Vipul; Kwartowitz, David M.

    2017-03-01

    Autosomal dominant polycystic kidney disease (ADPKD) is the fourth most common cause of kidney transplant worldwide, accounting for 7-10% of all cases. Although ADPKD usually progresses over many decades, accurate risk prediction is an important task [1]. Identifying patients with progressive disease is vital to providing them with new treatments being developed and enabling them to enter clinical trials for new therapy. Among other factors, total kidney volume (TKV) is a major biomarker predicting the progression of ADPKD. The Consortium for Radiologic Imaging Studies in Polycystic Kidney Disease (CRISP) [2] has shown that TKV is an early and accurate measure of cystic burden and likely growth rate, and that it is strongly associated with loss of renal function [3]. While ultrasound (US) has proven an excellent tool for diagnosing the disease, monitoring short-term changes with ultrasound has been shown to be inaccurate; this is attributed to high operator variability and poor reproducibility compared to tomographic modalities such as CT and MR (the gold standard). Ultrasound has nevertheless emerged as a standout modality for intra-procedural imaging, and methods for spatial localization have afforded us the ability to track 2D ultrasound in the physical space in which it is used. In addition, the vast amount of recorded tomographic data can be used to generate statistical shape models that allow us to extract clinical value from archived image sets. In this work, we aim to improve the prognostic value of US in managing ADPKD by assessing the accuracy of using statistical shape model augmented US data to predict TKV, with the end goal of monitoring short-term changes.

  20. An introduction to statistical process control in research proteomics.

    Science.gov (United States)

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. We aim to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments: every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier

  1. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  2. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
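
    Of the three chart types named above, the EWMA chart is the most sensitive to slow drifts. The sketch below applies the standard EWMA recursion and its time-varying 3-sigma limits to simulated per-beam dose deviations; the smoothing constant and data are assumptions, not values from the study.

        import numpy as np

        rng = np.random.default_rng(2)
        x = rng.normal(0.0, 1.0, size=50)    # per-beam % deviations (simulated)

        lam = 0.2                            # smoothing constant, a common choice
        mu, sigma = x.mean(), x.std(ddof=1)  # baseline estimated from the data

        z, prev = np.empty_like(x), mu       # start the EWMA at the process mean
        for i, xi in enumerate(x):
            prev = lam * xi + (1 - lam) * prev
            z[i] = prev

        k = np.arange(1, len(x) + 1)
        half = 3 * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
        print("beams signalling a drift:", np.where(np.abs(z - mu) > half)[0])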

  3. A new instrument for statistical process control of thermoset molding

    International Nuclear Information System (INIS)

    Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.

    1991-01-01

    The recent development of a rugged, ceramic, mold-mounted dielectric sensor and high-speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work, special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including the flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points as well as temperature and pressure were then recorded during normal production and plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper, examples of both RIM and SMC are discussed

  4. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  5. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    Science.gov (United States)

    2016-05-12

    Final Report (15-May-2014 to 14-Feb-2015): Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory. U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: mathematical statistics; time series; Markov chains; random… Three areas

  6. Auto-recognition of surfaces and auto-generation of material removal volume for finishing process

    Science.gov (United States)

    Kataraki, Pramod S.; Salman Abu Mansor, Mohd

    2018-03-01

    Auto-recognition of surfaces and auto-generation of material removal volumes for the recognised surfaces have become necessary for successful downstream manufacturing activities such as automated process planning and scheduling. A few researchers have contributed to the generation of material removal volumes for a product, but their methods produced a discontinuity between two adjacent material removal volumes generated from two adjacent faces that form a convex geometry. Aiming at limitation-free material removal volume generation, an algorithm was developed that automatically recognises a computer aided design (CAD) model's surfaces and auto-generates the material removal volume for the finishing process of the recognised surfaces. The surfaces of the CAD model are successfully recognised by the developed algorithm and the required material removal volume is obtained. The material removal volume discontinuity that occurred in earlier studies is eliminated.

  7. Radioactive waste package assay facility. Volume 3. Data processing

    International Nuclear Information System (INIS)

    Creamer, S.C.; Lalies, A.A.; Wise, M.O.

    1992-01-01

    This report, in three volumes, covers the work carried out by Taylor Woodrow Construction Ltd, and two major sub-contractors: Harwell Laboratory (AEA Technology) and Siemens Plessey Controls Ltd, on the development of a radioactive waste package assay facility, for cemented 500 litre intermediate level waste drums. Volume 3, describes the work carried out by Siemens Plessey Controls Ltd on the data-processing aspects of an integrated waste assay facility. It introduces the need for a mathematical model of the assay process and develops a deterministic model which could be tested using Harwell experimental data. Relevant nuclear reactions are identified. Full implementation of the model was not possible within the scope of the Harwell experimental work, although calculations suggested that the model behaved as predicted by theory. 34 figs., 52 refs., 5 tabs

  8. Statistical process control charts for monitoring military injuries.

    Science.gov (United States)

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
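
    The u-chart construction described above (limits three standard deviations around a historical average rate, with exposure-dependent width) reduces to a few lines. The quarterly counts and person-year exposures below are invented for illustration; only the formulas follow the standard Shewhart u-chart.

        import numpy as np

        # quarterly injury counts and exposures (in 1000 person-years), illustrative
        counts = np.array([1280, 1310, 1405, 1262, 1490, 1335])
        exposure = np.array([1.02, 0.99, 1.01, 0.98, 1.00, 1.03])

        u = counts / exposure                   # rate per 1000 person-years
        u_bar = counts.sum() / exposure.sum()   # historical average rate

        ucl = u_bar + 3 * np.sqrt(u_bar / exposure)   # limits widen as exposure shrinks
        lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / exposure), 0.0)
        print("quarters out of control:", np.where((u > ucl) | (u < lcl))[0])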

  9. Petroleum supply annual 1998: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The "Petroleum Supply Annual" (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1998 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1998, and replaces data previously published in the PSA. The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. 16 figs., 59 tabs.

  10. Petroleum supply annual, 1997. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1997 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1997, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. 16 figs., 48 tabs.

  11. Petroleum supply annual 1998: Volume 1

    International Nuclear Information System (INIS)

    1999-06-01

    The "Petroleum Supply Annual" (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1998 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1998, and replaces data previously published in the PSA. The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. 16 figs., 59 tabs

  12. Petroleum supply annual, 1997. Volume 1

    International Nuclear Information System (INIS)

    1998-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1997 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1997, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. 16 figs., 48 tabs

  13. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike.  Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering.  The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applications…

  14. On the joint statistics of stable random processes

    International Nuclear Information System (INIS)

    Hopcraft, K I; Jakeman, E

    2011-01-01

    A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)

  15. Statistical Process Control. Impact and Opportunities for Ohio.

    Science.gov (United States)

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  16. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
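
    A minimal version of the individuals/moving-range (XmR) chart recommended above uses the conventional 2.66 and 3.268 scaling constants applied to the average moving range. The daily door-to-provider times below are hypothetical; this particular series turns out to be in control.

        import numpy as np

        x = np.array([34, 41, 29, 38, 52, 36, 33, 47, 31, 39], float)  # minutes
        mr = np.abs(np.diff(x))            # moving ranges between successive days

        x_bar, mr_bar = x.mean(), mr.mean()
        unpl = x_bar + 2.66 * mr_bar       # upper natural process limit
        lnpl = x_bar - 2.66 * mr_bar       # lower natural process limit
        url = 3.268 * mr_bar               # upper limit for the moving ranges

        print("individual-value signals:", np.where((x > unpl) | (x < lnpl))[0])
        print("moving-range signals:", np.where(mr > url)[0])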

  17. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  18. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Full Text Available Atomic force microscopy (AFM) has become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper focuses on the analysis of AFM force curves collected on biological systems, in particular bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between the AFM probe and bacterium are accounted for, and mechanical interactions operating after contact are described in terms of a Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted by a regression procedure fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
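
    The Freely Jointed Chain model used above for retraction curves has a closed-form extension-force relation through the Langevin function. The sketch below evaluates it for assumed contour and Kuhn lengths; it is only a generic FJC evaluation, not the paper's fitting code.

        import numpy as np

        kT = 4.11e-21      # thermal energy at ~298 K [J]
        b = 0.3e-9         # Kuhn (segment) length [m], assumed
        Lc = 150e-9        # contour length [m], assumed

        def fjc_extension(force):
            """FJC chain extension [m] at a stretching force [N]:
            z = Lc * (coth(F*b/kT) - kT/(F*b))."""
            x = force * b / kT
            return Lc * (1.0 / np.tanh(x) - 1.0 / x)

        forces = np.linspace(1e-12, 300e-12, 100)   # 1-300 pN
        extension = fjc_extension(forces)
        print(f"extension at 100 pN: {fjc_extension(100e-12) * 1e9:.1f} nm")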

  19. Design and Statistics in Quantitative Translation (Process) Research

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

    2015-01-01

    Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars, since this way of thinking is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product.

  20. Evaluation of influence of the locality, the vintage year, wine variety and fermentation process on volume of cooper and lead in wine

    Directory of Open Access Journals (Sweden)

    Jaroslav Jedlička

    2014-11-01

    Full Text Available We evaluated the influence of locality, vintage year and the fermentation process on the copper and lead content of grape must and wine. Copper and lead were first assayed in fresh grape musts, which were subsequently fermented. Analysis of the resulting wines showed a large decrease in copper during fermentation: assessed Cu2+ values ranged from 0.07 to 0.2 mg.L-1, representing a decrease of 90 to 97% from the original copper content. Copper content in grapes is probably also significantly influenced by the amount of precipitation falling in the second half of the growing season. Total rainfall in the period before grape harvesting (August-September) was 153 mm in the first year and 137.5 mm in the second, above-average values in both observed vintage years. Copper cannot be entirely eliminated from the protection of the vine against fungal diseases, because pathogens do not develop resistance to it; to address this, it is advisable to combine copper-based and organic products. Fermentation acts as a biological filter and also influences the lead content: in the analysed wines we found a decrease in lead of 25 to 94%, with a maximal assessed Pb2+ value in wine of 0.09 mg.L-1. A linear relationship between lead and copper in grape must and lead and copper in wine was not statistically demonstrated. We did find a statistically significant effect of the vintage year on the lead content of grape must which, as we supposed, was connected with the quantity and distribution of atmospheric precipitation during the vegetation period. On the basis of these results for lead and copper content in wine, we state that by using sound materials and appropriate technological equipment during wine production, it is possible to eliminate almost…

  1. Nonclinical statistics for pharmaceutical and biotechnology industries

    CERN Document Server

    2016-01-01

    This book serves as a reference text for regulatory, industry and academic statisticians and also a handy manual for entry-level statisticians. Additionally, it aims to stimulate academic interest in the field of nonclinical statistics and to promote this as an important discipline in its own right. This text brings together for the first time in a single volume a comprehensive survey of methods important to the nonclinical science areas within the pharmaceutical and biotechnology industries; specifically the Discovery and Translational sciences, the Safety/Toxicology sciences, and the Chemistry, Manufacturing and Controls sciences. Drug discovery and development is a long and costly process. Most decisions in the drug development process are made with incomplete information. The data are rife with uncertainties and hence risky by nature. This is therefore the purview of Statistics. As such, this book aims to introduce readers to important statistical thinking and its application in these nonclinical areas. The cha...

  2. Statistical processing of experimental data

    OpenAIRE

    NAVRÁTIL, Pavel

    2012-01-01

    This thesis covers probability theory and statistical sets: solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Solutions to the unsolved problems are included.

  3. Advances in statistics

    Science.gov (United States)

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examine a hierarchical model...

  4. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  5. Calcul statistique du volume des blocs matriciels d'un gisement fissuré The Statistical Computing of Matrix Block Volume in a Fissured Reservoir

    Directory of Open Access Journals (Sweden)

    Guez F.

    2006-11-01

    …the distribution of block volumes. But it is precisely this distribution that governs the choice of one or several successive recovery methods. Therefore, this article describes an original method for statistically computing the distribution law of matrix-block volumes. This method can be applied at any point in a reservoir, and the portion of the reservoir occupied by blocks of a given volume can be deduced from it. A general understanding of the fracturing phenomenon serves as the basis for the model. Subsurface observations of reservoir fracturing provide the data (histograms of fracture direction and spacing). An application to the Eschau field (Alsace, France) is described here to illustrate the method.
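
    One way to realize the statistical computation sketched above is by Monte Carlo: if spacing distributions are fitted to the fracture histograms for three near-orthogonal sets (exponential below, purely as a simplifying assumption), matrix-block volumes follow as products of independent spacings. All parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(3)
        mean_spacing = (0.5, 0.8, 1.2)   # metres, one value per fracture set (assumed)
        s = [rng.exponential(m, size=100_000) for m in mean_spacing]
        volumes = s[0] * s[1] * s[2]     # block volume in m^3

        print("median block volume:", np.median(volumes))
        print("fraction of blocks below 0.1 m^3:", (volumes < 0.1).mean())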

  6. Michigan forest statistics, 1980.

    Science.gov (United States)

    Gerhard K. Raile; W. Brad Smith

    1983-01-01

    The fourth inventory of the timber resource of Michigan shows a 7% decline in commercial forest area and a 27% gain in growing-stock volume between 1966 and 1980. Highlights and statistics are presented on area, volume, growth, mortality, removals, utilization, and biomass.

  7. Illinois forest statistics, 1985.

    Science.gov (United States)

    Jerold T. Hahn

    1987-01-01

    The third inventory of the timber resource of Illinois shows a 1% increase in commercial forest area and a 40% gain in growing-stock volume between 1962 and 1985. Presented are highlights and statistics on area, volume, growth, mortality, removals, utilization, and biomass.

  8. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Science.gov (United States)

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to greater security of intensity modulated radiotherapy (IMRT) treatments. This improvement is achieved by controlling the dose delivery process, characterized by pretreatment quality control results; it is therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to determine whether portal dosimetry can be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, coming from industry, used to control and improve the quality of the studied process. It uses graphic tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts showing drifts of the mean and standard deviation, both slow and weak as well as strong and fast, were established and revealed a special cause (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to
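
    A capability study of the kind mentioned above can be summarized with the classical Cp/Cpk indices computed against clinical tolerances of ±4%. The simulated deviations below are illustrative, and Cp/Cpk are the generic textbook indices rather than the specific index used in the paper.

        import numpy as np

        rng = np.random.default_rng(4)
        dev = rng.normal(0.3, 0.9, size=450)   # per-beam % deviations, simulated
        usl, lsl = 4.0, -4.0                   # clinical tolerances of +/-4%

        cp = (usl - lsl) / (6 * dev.std(ddof=1))                  # spread vs tolerance
        cpk = min(usl - dev.mean(), dev.mean() - lsl) / (3 * dev.std(ddof=1))
        print(f"Cp = {cp:.2f}  Cpk = {cpk:.2f}")   # > 1.33 is a common benchmark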

  9. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to greater security of intensity modulated radiotherapy (I.M.R.T.) treatments. This improvement is achieved by controlling the dose delivery process, characterized by pretreatment quality control results; it is therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to determine whether portal dosimetry can be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, coming from industry, used to control and improve the quality of the studied process. It uses graphic tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results: Control charts showing drifts of the mean and standard deviation, both slow and weak as well as strong and fast, were established and revealed a special cause (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the E.P.I.D. and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study demonstrated the feasibility of reducing the time devoted to

  10. Philosophy of statistics

    CERN Document Server

    Forster, Malcolm R

    2011-01-01

    Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling "restricted" by their disciplines or thinking "piecemeal" in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.

  11. Statistics to the Rescue!: Using Data to Evaluate a Manufacturing Process

    Science.gov (United States)

    Keithley, Michael G.

    2009-01-01

    The use of statistics and process controls is too often overlooked in educating students. This article describes an activity appropriate for high school students who have a background in material processing. It gives them a chance to advance their knowledge by determining whether or not a manufacturing process works well. The activity follows a…

  12. Guideline implementation in clinical practice: Use of statistical process control charts as visual feedback devices

    Directory of Open Access Journals (Sweden)

    Fahad A Al-Hussein

    2009-01-01

    Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.

  13. Statistical methods for evaluating the attainment of cleanup standards

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.

  14. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    … These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology to different psychophysiological features in an attempt to classify pilot mental workload...

  15. Statistical process control for radiotherapy quality assurance

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.

    2005-01-01

    Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a very small number of large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques are applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process, whereas with a standard deviation calculation no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process

  16. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  17. New method of assigning uncertainty in volume calibration

    International Nuclear Information System (INIS)

    Lechner, J.A.; Reeve, C.P.; Spiegelman, C.H.

    1980-12-01

    This paper presents a practical statistical overview of the pressure-volume calibration curve for large nuclear materials processing tanks. It explains the appropriateness of applying splines (piecewise polynomials) to this curve, and it presents an overview of the associated statistical uncertainties. In order to implement these procedures, a practical and portable FORTRAN IV program is provided along with its users' manual. Finally, the recommended procedure is demonstrated on actual tank data collected by NBS
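
    A piecewise-polynomial fit of the kind recommended above can be reproduced with a smoothing spline. The pressure-volume points below are invented, and scipy's UnivariateSpline stands in for the report's FORTRAN implementation; the statistical uncertainty analysis itself is not reproduced here.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        pressure = np.array([1.0, 2.1, 3.0, 4.2, 5.1, 6.0, 7.2, 8.1])   # e.g. kPa
        volume = np.array([102, 215, 301, 428, 519, 605, 731, 823.0])   # e.g. litres

        spline = UnivariateSpline(pressure, volume, k=3, s=1.0)  # cubic, smoothed
        residuals = volume - spline(pressure)

        print("volume at 5.5 kPa:", spline(5.5))
        print("residual std:", residuals.std(ddof=1))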

  18. Petroleum supply annual 1994. Volume 1

    International Nuclear Information System (INIS)

    1995-01-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1994 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity each with final annual data. The second volume contains final statistics for each month of 1994, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Below is a description of each section in Volume 1 of the PSA

  19. Petroleum supply annual 1992: Volume 1

    International Nuclear Information System (INIS)

    1993-01-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1992 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity, each with final annual data. The second volume contains final statistics for each month of 1992, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them.

  20. Radiographic rejection index using statistical process control

    International Nuclear Information System (INIS)

    Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.

    2015-01-01

    The Repeat Analysis Index (IRR) is one of the items contained in the quality control program required by Brazilian radiological protection law and should be determined frequently, at least every six months. In order to extract more and better information from the IRR, this study applies statistical quality control to the reject rate through statistical process control (a control chart for attributes, the p chart; GC) and the Pareto chart (GP). Data were collected for nine months, with daily collection during the last four months. The control limits (LC) were established, and the Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analyzed. Relevant information, such as service orders for the X-ray equipment and processors, was cross-referenced to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The GC demonstrated the ability to predict equipment failures, and the GP showed clearly which causes recur in the IRR. (authors) [pt]
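
    A control chart for attributes of this kind (a p chart with per-day 3-sigma limits) takes only a few lines; the daily film counts, the 8.8% baseline rate, and the simulated processor fault below are invented stand-ins for the study's data.

      import numpy as np

      rng = np.random.default_rng(3)
      films = rng.integers(40, 80, size=30)             # films taken per day
      p_true = np.full(30, 0.088)
      p_true[20:] = 0.18                                # processor fault after day 20
      rejects = rng.binomial(films, p_true)

      p = rejects / films
      p_bar = rejects[:20].sum() / films[:20].sum()     # baseline reject rate
      # Per-day 3-sigma limits: the binomial sigma depends on each day's n.
      sigma = np.sqrt(p_bar * (1 - p_bar) / films)
      ucl = p_bar + 3 * sigma
      lcl = np.clip(p_bar - 3 * sigma, 0.0, None)

      for day in np.where((p > ucl) | (p < lcl))[0]:
          print(f"day {day:2d}: p = {p[day]:.3f} outside [{lcl[day]:.3f}, {ucl[day]:.3f}]")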

  1. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  2. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  3. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  4. International Conference on Mathematical Sciences and Statistics 2013 : Selected Papers

    CERN Document Server

    Leong, Wah; Eshkuvatov, Zainidin

    2014-01-01

    This volume is devoted to the most recent discoveries in mathematics and statistics. It also serves as a platform for knowledge and information exchange between experts from industrial and academic sectors. The book covers a wide range of topics, including mathematical analyses, probability, statistics, algebra, geometry, mathematical physics, wave propagation, stochastic processes, ordinary and partial differential equations, boundary value problems, linear operators, cybernetics and number and functional theory. It is a valuable resource for pure and applied mathematicians, statisticians, engineers and scientists.

  5. Heterogeneous Rock Simulation Using DIP-Micromechanics-Statistical Methods

    Directory of Open Access Journals (Sweden)

    H. Molladavoodi

    2018-01-01

    Rock as a natural material is heterogeneous. Rock material consists of minerals, crystals, cement, grains, and microcracks. Each component of rock has a different mechanical behavior under an applied loading condition. Therefore, the rock component distribution has an important effect on rock mechanical behavior, especially in the postpeak region. In this paper, a rock sample was studied by digital image processing (DIP), micromechanics, and statistical methods. Using image processing, the volume fractions of the minerals composing the rock sample were evaluated precisely. The mechanical properties of the rock matrix were determined based on upscaling micromechanics. In order to consider the effect of rock heterogeneities on mechanical behavior, a heterogeneity index was calculated in the framework of a statistical method. A Weibull distribution function was fitted to the Young's modulus distribution of the minerals. Finally, the statistical and Mohr–Coulomb strain-softening models were used simultaneously as a constitutive model in a DEM code. The acoustic emission, strain energy release, and the effect of rock heterogeneities on the postpeak behavior were investigated. The numerical results are in good agreement with experimental data.
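
    The statistical step can be mimicked by fitting a two-parameter Weibull distribution to per-mineral Young's modulus values; the shape parameter then acts as a homogeneity index (the larger the shape, the more homogeneous the rock). The modulus sample below is synthetic.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      # Synthetic Young's moduli (GPa) assigned to the imaged mineral phases.
      E = rng.weibull(4.0, 2000) * 65.0

      # Two-parameter Weibull fit (location fixed at zero); the shape m is
      # the homogeneity index, the scale E0 a characteristic modulus.
      m, loc, E0 = stats.weibull_min.fit(E, floc=0.0)
      print(f"Weibull shape m (homogeneity index): {m:.2f}")
      print(f"Weibull scale E0 (GPa): {E0:.1f}")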

  6. Westinghouse Modular Grinding Process - Enhancement of Volume Reduction for Hot Resin Supercompaction - 13491

    Energy Technology Data Exchange (ETDEWEB)

    Fehrmann, Henning [Westinghouse Electric Germany GmbH, Dudenstr. 44, D-68167 Mannheim (Germany); Aign, Joerg [Westinghouse Electric Germany GmbH, Global D and D and Waste Management, Tarpenring 6, D-22419 Hamburg (Germany)

    2013-07-01

    In nuclear power plants (NPP), ion exchange (IX) resins are used in several systems for water treatment. Spent resins can contain a significant amount of contaminants, which makes treatment of spent resins for disposal mandatory. Several treatment processes are available, such as direct immobilization with technologies like cementation, bitumisation, polymer solidification, or use of a high integrity container (HIC). These technologies usually come with a significant increase in final waste volume. Hot Resin Supercompaction (HRSC) is a thermal treatment process which reduces the resin waste volume significantly. For a mixture of powdered and bead resins, the HRSC process has demonstrated a volume reduction of up to 75% [1]. For bead resins alone, the HRSC process is challenging because the compaction properties of bead resins are unfavorable. The bead resin material does not form a solid block after compaction and shows a high spring-back effect. The volume reduction of bead resins is not as good as for the mixture described in [1]. The compaction properties of bead resin waste can be significantly improved by grinding the beads to powder. The grinding also eliminates the need for a powder additive. Westinghouse has developed a modular grinding process to grind the bead resin to powder. The developed process requires no circulation of resins and enables a selective adjustment of particle size and distribution to achieve optimal results in the HRSC or in any other following process. A special grinding tool setup is used to minimize maintenance and radiation exposure to personnel. (authors)

  7. Process simulation and statistical approaches for validating waste form qualification models

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Toland, M.R.; Pulsipher, B.A.

    1989-05-01

    This report describes recent progress toward one of the principal objectives of the Nuclear Waste Treatment Program (NWTP) at the Pacific Northwest Laboratory (PNL): to establish relationships between vitrification process control and glass product quality. During testing of a vitrification system, it is important to show that departures affecting the product quality can be sufficiently detected through process measurements to prevent an unacceptable canister from being produced. Meeting this goal is a practical definition of a successful sampling, data analysis, and process control strategy. A simulation model has been developed and preliminarily tested by applying it to approximate operation of the West Valley Demonstration Project (WVDP) vitrification system at West Valley, New York. Multivariate statistical techniques have been identified and described that can be applied to analyze large sets of process measurements. Information on components, tanks, and time is then combined to create a single statistic through which all of the information can be used at once to determine whether the process has shifted away from a normal condition.
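
    Combining many process measurements into a single monitoring statistic is classically done with Hotelling's T^2; the sketch below, with simulated in-control training data and an F-distribution control limit, illustrates the idea rather than the NWTP implementation.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      # Simulated in-control training data: 200 snapshots of 6 measurements
      # (stand-ins for feed compositions, tank levels, and the like).
      X = rng.normal(size=(200, 6))
      mu = X.mean(axis=0)
      S_inv = np.linalg.inv(np.cov(X, rowvar=False))

      def t2(x):
          """Hotelling's T^2: squared Mahalanobis distance from the mean."""
          d = x - mu
          return float(d @ S_inv @ d)

      # Control limit for a new, independent observation (F distribution).
      n, p = X.shape
      lim = p * (n - 1) * (n + 1) / (n * (n - p)) * stats.f.ppf(0.99, p, n - p)

      x_new = mu + np.array([2.5, 0.0, 0.0, -2.0, 0.0, 0.0])  # shifted snapshot
      print(f"T^2 = {t2(x_new):.1f}, 99% limit = {lim:.1f}")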

  8. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  9. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    Science.gov (United States)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC), including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor degreasing process.
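
    Control limits of the kind described are conventionally derived from subgrouped data with Shewhart X-bar/S constants; the tensile-adhesion strengths below are invented, and only the standard constants for subgroups of five are assumed.

      import numpy as np

      rng = np.random.default_rng(9)
      # Six certification rounds (subgroups) of five tensile-adhesion panels;
      # strengths in psi are invented stand-ins for the witness-panel data.
      subgroups = rng.normal(4500.0, 150.0, size=(6, 5))

      xbar = subgroups.mean(axis=1)
      s = subgroups.std(axis=1, ddof=1)
      xbar_bar, s_bar = xbar.mean(), s.mean()

      # Shewhart X-bar/S chart constants for subgroup size n = 5.
      A3, B3, B4 = 1.427, 0.0, 2.089
      print(f"X-bar chart: UCL = {xbar_bar + A3 * s_bar:.0f}, LCL = {xbar_bar - A3 * s_bar:.0f}")
      print(f"S chart:     UCL = {B4 * s_bar:.0f}, LCL = {B3 * s_bar:.0f}")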

  10. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. "Bayesian statistical inference" is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  11. International Conference on Robust Statistics

    CERN Document Server

    Filzmoser, Peter; Gather, Ursula; Rousseeuw, Peter

    2003-01-01

    Aspects of Robust Statistics are important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers and practitioners, as well as younger researchers. The papers cover a multitude of different aspects of Robust Statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include e.g.: robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms. Finally, the aspects of application and programming tools complete the volume.

  12. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  13. Tungsten Ions in Plasmas: Statistical Theory of Radiative-Collisional Processes

    Directory of Open Access Journals (Sweden)

    Alexander V. Demura

    2015-05-01

    A statistical model for calculations of the collisional-radiative processes in plasmas with tungsten impurity was developed. The electron structure of tungsten multielectron ions is considered in terms of both the Thomas-Fermi model and the Brandt-Lundquist model of collective oscillations of atomic electron density. The excitation or ionization of atomic electrons by plasma electron impacts is represented as photo-processes under the action of a flux of equivalent photons, as introduced by E. Fermi. The total electron-impact single-ionization cross-sections of W^k+ ions, with the respective rates, have been calculated and compared with the available experimental and modeling data (e.g., CADW). Plasma radiative losses on tungsten impurity were also calculated in a wide range of electron temperatures, 1 eV–20 keV. The numerical code TFATOM was developed for calculations of radiative-collisional processes involving tungsten ions. The computational resources needed for the TFATOM code are orders of magnitude smaller than for other conventional numerical codes. The transition from the corona to the Boltzmann limit was investigated in detail. The results of the statistical approach have been tested by comparison with the extensive experimental and conventional-code data for a set of W^k+ ions. It is shown that the accuracy of the universal statistical model for the ionization cross-sections and radiation losses is within the data scattering of significantly more complex quantum numerical codes, which use different approximations for the calculation of atomic structure and electronic cross-sections.

  14. Controlled air incinerator for radioactive waste. Volume I. Rationale, process, equipment, performance, and recommendations

    International Nuclear Information System (INIS)

    Neuls, A.S.; Draper, W.E.; Koenig, R.A.; Newmyer, J.M.; Warner, C.L.

    1982-11-01

    This two-volume report provides detailed design and operating documentation of the Los Alamos National Laboratory Controlled Air Incinerator (CAI) and is an aid to technology transfer to other Department of Energy contractor sites and the commercial sector. Volume I describes the CAI process, equipment, and performance, and it recommends modifications based on Los Alamos experience. It provides the necessary information for conceptual design and feasibility studies. Volume II provides descriptive engineering information such as drawings, specifications, calculations, and costs. It aids duplication of the process at other facilities.

  15. Process innovations to minimize waste volumes at Savannah River

    International Nuclear Information System (INIS)

    Doherty, J.P.

    1986-01-01

    In 1983, approximately 1.6 x 10^3 m^3 (427,000 gallons) of radioactive salt solution were decontaminated in a full-scale demonstration. The cesium decontamination factor (DF) was in excess of 4 x 10^4 vs. a goal of 1 x 10^4. Data from this test were combined with pilot data and used to design the permanent facilities currently under construction. Startup of the Salt Decontamination Process is scheduled for 1987; it will decontaminate 2 x 10^4 m^3 (5.2 million gallons) of radioactive salt solution and generate 2 x 10^3 m^3 (520,000 gallons) of concentrated and washed precipitate per year. The Defense Waste Processing Facility (DWPF) will begin processing this concentrate in the Precipitate Hydrolysis Process starting in 1989. Laboratory data using simulated salt solution and nonradioactive cesium are being used to design this process. A 1/5-scale pilot plant is under construction and will be used to gain large-scale operating experience using nonradioactive simulants. This pilot plant is scheduled to start up in early 1987. The incentives to reduce the volume of waste that must be treated are self-evident. At Savannah River, process development innovations to minimize the DWPF feed volumes have directly improved the economics of the process. The integrity of the final borosilicate glass waste form has not been compromised by these developments. Many of the unit operations are familiar to chemical engineers and were put to use in a unique environment. As a result, tax dollars have been saved, and the objective of safely disposing of the nation's high-level defense waste has moved forward.

  16. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart indicates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network, and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced, based on the cuckoo optimization algorithm (COA), to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
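
    The two-module idea (feature extraction followed by a neural-network classifier) can be sketched as follows; the synthetic patterns, the reduced feature set, and the scikit-learn MLP are simplifications, and no cuckoo-optimization tuning is attempted.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      N, W = 600, 32                                # patterns, window length
      t = np.arange(W)

      def make_pattern(label):
          x = rng.normal(0.0, 1.0, W)               # in-control noise
          if label == 1:
              x = x + 0.15 * t                      # upward trend
          if label == 2:
              x[W // 2:] += 3.0                     # sudden shift
          return x

      labels = rng.integers(0, 3, N)
      windows = np.array([make_pattern(c) for c in labels])

      def features(x):
          slope = np.polyfit(t, x, 1)[0]                    # shape: trend slope
          shift = x[W // 2:].mean() - x[:W // 2].mean()     # shape: level shift
          return [slope, shift, x.mean(), x.std(), x.max() - x.min()]

      F = np.array([features(x) for x in windows])
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(F[:450], labels[:450])
      print(f"holdout accuracy: {clf.score(F[450:], labels[450:]):.2f}")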

  17. Feasibility of large volume casting cementation process for intermediate level radioactive waste

    International Nuclear Information System (INIS)

    Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze

    1988-01-01

    Recent tendencies in radioactive waste treatment and disposal, both in China and abroad, are reviewed. The feasibility of the large volume casting cementation process for treating the intermediate level radioactive waste from a spent fuel reprocessing plant and disposing of it in shallow land is assessed on the basis of analyses of experimental results (formulation studies, measurements of the properties of the solidified radioactive waste, etc.). It can be concluded that the large volume casting cementation process is a promising, safe, and economical process. It is feasible to dispose of the intermediate level radioactive waste from the reprocessing plant if the chosen disposal site has reasonable geological and geographical conditions and some additional effective protection measures are taken.

  18. Influences of excluded volume of molecules on signaling processes on the biomembrane.

    Directory of Open Access Journals (Sweden)

    Masashi Fujii

    We investigate the influences of the excluded volume of molecules on biochemical reaction processes on 2-dimensional surfaces using a model of signal transduction processes on biomembranes. We perform simulations of the 2-dimensional cell-based model, which describes the reactions and diffusion of the receptors, signaling proteins, target proteins, and crowders on the cell membrane. The signaling proteins are activated by receptors, and these activated signaling proteins activate target proteins that bind autonomously from the cytoplasm to the membrane, and unbind from the membrane if activated. If the target proteins bind frequently, the volume fraction of molecules on the membrane becomes so large that the excluded volume of the molecules cannot be neglected in the reaction and diffusion dynamics. We find that such excluded volume effects induce non-trivial variations of the signal flow, defined as the activation frequency of target proteins, as follows. With an increase in the binding rate of target proteins, the signal flow varies by (i) monotonically increasing; (ii) increasing then decreasing in a bell-shaped curve; or (iii) increasing, decreasing, then increasing in an S-shaped curve. We further demonstrate that the excluded volume of molecules influences the hierarchical molecular distributions throughout the reaction processes. In particular, when the system exhibits a large signal flow, the signaling proteins tend to surround the receptors to form receptor-signaling protein clusters, and the target proteins tend to become distributed around such clusters. To explain these phenomena, we analyze a stochastic model of the local motions of molecules around the receptor.

  19. Multiresolution, Geometric, and Learning Methods in Statistical Image Processing, Object Recognition, and Sensor Fusion

    National Research Council Canada - National Science Library

    Willsky, Alan

    2004-01-01

    Our research blends methods from several fields: statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...

  20. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Science.gov (United States)

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC the firms needed, and how…

  1. Theoretical test of Jarzynski's equality for reversible volume-switching processes of an ideal gas system.

    Science.gov (United States)

    Sung, Jaeyoung

    2007-07-01

    We present an exact theoretical test of Jarzynski's equality (JE) for reversible volume-switching processes of an ideal gas system. The exact analysis shows that the prediction of JE for the free energy difference is the same as the work done on the gas system during the reversible process, which depends on the shape of the path of the reversible volume-switching process.
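
    For context, Jarzynski's equality and its specialization to a quasi-static isothermal volume switch of an ideal gas can be written compactly; these are standard results, not equations quoted from the paper. For such a reversible process the work is deterministic, so the average collapses to a single term:

      \langle e^{-\beta W} \rangle = e^{-\beta \Delta F}, \qquad \beta = 1/(k_B T)

      W_{\mathrm{rev}} = -\int_{V_i}^{V_f} p \, \mathrm{d}V
                       = -N k_B T \ln(V_f / V_i) = \Delta F
      \quad \Longrightarrow \quad e^{-\beta W_{\mathrm{rev}}} = e^{-\beta \Delta F}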

  2. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    Science.gov (United States)

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, namely a twin-screw high-shear granulator, a fluid bed dryer, and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed, and liquid mass flow, and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T^2 and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
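
    A skeletal version of such an NOC-model workflow, with Hotelling's T^2 computed in the retained principal-component subspace and Q on the residuals, is sketched below; the data and variable count are placeholders, not ConsiGma-25 logs.

      import numpy as np

      rng = np.random.default_rng(2)
      # Placeholder NOC data: 500 time points x 35 logged variables.
      X = rng.normal(size=(500, 35))
      mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
      Z = (X - mu) / sd                             # autoscale

      # PCA of the NOC data via SVD; keep k principal components.
      U, svals, Vt = np.linalg.svd(Z, full_matrices=False)
      k = 5
      P = Vt[:k].T                                  # loadings, 35 x k
      lam = svals[:k] ** 2 / (Z.shape[0] - 1)       # score variances

      def monitor(x):
          z = (x - mu) / sd
          scores = P.T @ z
          T2 = float(np.sum(scores ** 2 / lam))     # distance inside the model
          resid = z - P @ scores
          Q = float(resid @ resid)                  # distance from the model
          return T2, Q

      x_fault = X[0].copy()
      x_fault[7] += 6 * sd[7]                       # disturb one variable
      print("T2, Q normal :", monitor(X[1]))
      print("T2, Q faulted:", monitor(x_fault))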

  3. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  4. Hazard rate model and statistical analysis of a compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2005-01-01

    Roč. 41, č. 6 (2005), s. 773-786 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * Cox regression model * intensity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005

  5. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    Science.gov (United States)

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  6. Fisher's Contributions to Statistics

    Indian Academy of Sciences (India)

    T Krishnan. General Article, Resonance – Journal of Science Education, Volume 2, Issue 9, September 1997, pp. 32-37. Permanent link: https://www.ias.ac.in/article/fulltext/reso/002/09/0032-0037

  7. Automatic generation of statistical pose and shape models for articulated joints.

    Science.gov (United States)

    Xin Chen; Graham, Jim; Hutchinson, Charles; Muir, Lindsay

    2014-02-01

    Statistical analysis of motion patterns of body joints is potentially useful for detecting and quantifying pathologies. However, building a statistical motion model across different subjects remains a challenging task, especially for a complex joint like the wrist. We present a novel framework for simultaneous registration and segmentation of multiple 3-D (CT or MR) volumes of different subjects at various articulated positions. The framework starts with a pose model generated from 3-D volumes captured at different articulated positions of a single subject (template). This initial pose model is used to register the template volume to image volumes from new subjects. During this process, the Grow-Cut algorithm is used in an iterative refinement of the segmentation of the bone along with the pose parameters. As each new subject is registered and segmented, the pose model is updated, improving the accuracy of successive registrations. We applied the algorithm to CT images of the wrist from 25 subjects, each at five different wrist positions, and demonstrated that it performed robustly and accurately. More importantly, the resulting segmentations allowed a statistical pose model of the carpal bones to be generated automatically without interaction. The evaluation results show that our proposed framework achieved accurate registration with an average mean target registration error of 0.34 ± 0.27 mm. The automatic segmentation results also show high consistency with the ground truth obtained semi-automatically. Furthermore, we demonstrated the capability of the resulting statistical pose and shape models by using them to generate a measurement tool for scaphoid-lunate dissociation diagnosis, which achieved 90% sensitivity and specificity.

  8. Petroleum supply annual 1998: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1998 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. This second volume contains final statistics for each month of 1998, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.

  9. Petroleum supply annual, 1997. Volume 2

    International Nuclear Information System (INIS)

    1998-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1997 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1997, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs

  10. Petroleum supply annual, 1997. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1997 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1997, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.

  11. Petroleum supply annual 1995: Volume 2

    International Nuclear Information System (INIS)

    1996-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1995 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and selected Refinery Statistics, each with final annual data. This second volume contains final statistics for each month of 1995, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary.

  12. Petroleum supply annual 1998. Volume 2

    International Nuclear Information System (INIS)

    1999-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1998 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. This second volume contains final statistics for each month of 1998, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs

  13. Crop identification technology assessment for remote sensing. (CITARS) Volume 9: Statistical analysis of results

    Science.gov (United States)

    Davis, B. J.; Feiveson, A. H.

    1975-01-01

    Results of CITARS data processing are presented in raw form. Tables of descriptive statistics are given, along with descriptions and results of inferential analyses. The inferential results are organized by the questions which CITARS was designed to answer.

  14. Robust statistical reconstruction for charged particle tomography

    Science.gov (United States)

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection include statistical reconstruction of object volume scattering density profiles from charged particle tomographic data: the probability distribution of charged particle scattering is determined using a statistical multiple-scattering model, and a substantially maximum-likelihood estimate of the object volume scattering density is computed with an expectation-maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic-ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles, or cargo. The method can be implemented using a computer program which is executable on a computer.
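
    The patent's estimator is specific to multiple-scattering data; as a generic illustration of the ML/EM idea in tomography, the standard multiplicative MLEM update for a linear Poisson model is sketched below (this is not the muon-specific algorithm).

      import numpy as np

      def mlem(A, y, n_iter=200):
          """Multiplicative MLEM updates for y ~ Poisson(A @ lam); keeps lam
          nonnegative and increases the Poisson likelihood at each step."""
          lam = np.ones(A.shape[1])
          col_sum = np.maximum(A.sum(axis=0), 1e-12)
          for _ in range(n_iter):
              proj = np.maximum(A @ lam, 1e-12)
              lam *= (A.T @ (y / proj)) / col_sum
          return lam

      # Toy check: 3 voxels, 4 ray paths with known intersection lengths.
      A = np.array([[1.0, 0.5, 0.0],
                    [0.0, 1.0, 0.5],
                    [0.5, 0.0, 1.0],
                    [0.3, 0.3, 0.3]])
      lam_true = np.array([2.0, 0.5, 1.0])
      rng = np.random.default_rng(4)
      y = rng.poisson(A @ lam_true * 200) / 200.0   # averaged counts
      print(mlem(A, y).round(2))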

  15. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  16. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)

  17. Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example

    Directory of Open Access Journals (Sweden)

    Gmel Gerrit

    2010-03-01

    Background: Alcohol consumption is a major risk factor in the global burden of disease, with overall volume of exposure as the principal underlying dimension. Two main sources of data on volume of alcohol exposure are available: surveys and per capita consumption derived from routine statistics such as taxation. As both sources have significant problems, this paper presents an approach that triangulates information from both sources into disaggregated estimates in line with the overall level of per capita consumption. Methods: A modeling approach was applied to the US using data from a large and representative survey, the National Epidemiologic Survey on Alcohol and Related Conditions. Different distributions (log-normal, gamma, Weibull) were used to model consumption among drinkers in subgroups defined by sex, age, and ethnicity. The gamma distribution was used to shift the fitted distributions in line with the overall volume as derived from per capita estimates. Implications for alcohol-attributable fractions were presented, using liver cirrhosis as an example. Results: The triangulation of survey data with aggregated per capita consumption data proved feasible and allowed for modeling of alcohol exposure disaggregated by sex, age, and ethnicity. These models can be used in combination with risk relations for burden of disease calculations. Sensitivity analyses showed that the gamma distribution chosen yielded very similar results in terms of fit and alcohol-attributable mortality as the other tested distributions. Conclusions: Modeling alcohol consumption via the gamma distribution was feasible. To further refine this approach, research should focus on the main assumptions underlying the approach to explore differences between volume estimates derived from surveys and per capita consumption figures.
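
    The triangulation step can be sketched as follows: fit a gamma distribution to survey volumes among drinkers, then rescale its mean to the per-capita-implied level while keeping the fitted shape. The survey sample and the 80% coverage assumption are invented for illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      # Invented survey data: grams of pure alcohol per day among drinkers.
      survey = rng.gamma(shape=1.5, scale=8.0, size=5000)

      k, loc, theta = stats.gamma.fit(survey, floc=0.0)
      survey_mean = k * theta

      # Per-capita (sales) data imply a higher true mean among drinkers;
      # the 80% survey coverage used here is an assumption for illustration.
      percap_mean = survey_mean / 0.8
      theta_adj = percap_mean / k                   # shift the scale, keep shape

      heavy = 1.0 - stats.gamma.cdf(60.0, k, scale=theta_adj)
      print(f"fitted shape k = {k:.2f}")
      print(f"share drinking > 60 g/day after the upshift: {heavy:.3f}")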

  18. Advanced statistics to improve the physical interpretation of atomization processes

    International Nuclear Information System (INIS)

    Panão, Miguel R.O.; Radu, Lucian

    2013-01-01

    Highlights: Finite pdf mixtures improve the physical interpretation of sprays. A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. The statistical method identifies multiple droplet clusters in a spray. Multiple drop clusters are eventually associated with multiple atomization mechanisms. The spray is described by its drop size distribution and not only its moments. Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely, finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account eventual multimodality and heterogeneities in drop size distributions. Therefore, it provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in the heterogeneous data. This improves the physical interpretation of atomization processes. Moreover, it also overcomes the limitations induced by analyzing the spray droplet characteristics through moments alone, particularly the masking of different natures of droplet formation. Finally, the method is applied to physically interpret a case study based on multijet atomization processes.
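
    The finite-mixture idea can be illustrated with a two-component fit on log drop diameters; plain EM is substituted here for the paper's Bayesian MCMC search, and the bimodal drop-size sample is synthetic.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      # Synthetic bimodal drop sizes (um): two atomization mechanisms give
      # a two-component lognormal mixture.
      d = np.concatenate([rng.lognormal(2.5, 0.3, 1500),
                          rng.lognormal(4.0, 0.4, 500)])
      x = np.log(d)

      # EM for a two-component Gaussian mixture on the log-diameters.
      w = np.array([0.5, 0.5])
      mu = np.array([2.0, 4.5])
      s = np.array([1.0, 1.0])
      for _ in range(200):
          pdf = w * stats.norm.pdf(x[:, None], mu, s)   # (n, 2)
          resp = pdf / pdf.sum(axis=1, keepdims=True)   # E-step
          nk = resp.sum(axis=0)                         # M-step
          w = nk / x.size
          mu = (resp * x[:, None]).sum(axis=0) / nk
          s = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

      print("weights:", w.round(2), "modal diameters (um):", np.exp(mu).round(1))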

  19. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
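
    The capability side of such an analysis reduces to two standard indices computed from specification limits and the process spread; the measurements and limits below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(10)
      # Invented conformance measurements against specification limits.
      x = rng.normal(10.2, 0.25, 100)
      lsl, usl = 9.5, 11.0

      mu, sigma = x.mean(), x.std(ddof=1)
      cp = (usl - lsl) / (6 * sigma)                # potential capability
      cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # penalizes off-center runs
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")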

  20. Extrusion Process by Finite Volume Method Using OpenFoam Software

    International Nuclear Information System (INIS)

    Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz

    2011-01-01

    Computational codes are very important tools for solving engineering problems. In the analysis of metal forming processes, such as extrusion, this is no different, because computational codes allow analyzing the process at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents the velocity field and friction coefficient variation results obtained by numerical simulation using the OpenFOAM software and the FVM to solve an aluminum direct cold extrusion process.

  1. Statistical features of pre-compound processes in nuclear reactions

    International Nuclear Information System (INIS)

    Hussein, M.S.; Rego, R.A.

    1983-04-01

    Several statistical aspects of multistep compound processes are discussed. The connection between the cross-section auto-correlation function and the average number of maxima is emphasized. The restrictions imposed by the non-zero value of the energy step used in measuring the excitation function and by the experimental error are discussed. Applications are made to the system ^25Mg(^3He,p)^27Al. (Author) [pt]

  2. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product

  3. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  4. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Parameters estimated with confidence intervals and the testing of statistical hypotheses are used in statistical analysis to draw conclusions about a population from an extracted sample. The case study presented in this paper aims to highlight the importance of the size of the sample taken in the study and how this is reflected in the results obtained when using confidence intervals and hypothesis tests. While statistical hypothesis testing gives only a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and from findings in the "marginally significant" or "almost significant" range (p very close to 0.05).
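
    The sample-size point can be made concrete with a one-sample t procedure reporting both outputs side by side; the effect size and spread below are invented.

      import numpy as np
      from scipy import stats

      def ci_and_test(xbar, s, n, mu0=0.0, alpha=0.05):
          """One-sample t: report the p-value and the confidence interval."""
          se = s / np.sqrt(n)
          p = 2 * stats.t.sf(abs((xbar - mu0) / se), n - 1)
          half = stats.t.ppf(1 - alpha / 2, n - 1) * se
          return p, (xbar - half, xbar + half)

      # Same observed effect and spread, two different sample sizes.
      for n in (10, 200):
          p, (lo, hi) = ci_and_test(xbar=0.5, s=1.2, n=n)
          print(f"n = {n:3d}: p = {p:.3f}, 95% CI = ({lo:.2f}, {hi:.2f})")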

  5. Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control

    DEFF Research Database (Denmark)

    Vanhatalo, Erik; Kulahci, Murat

    2015-01-01

    A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic rendering sampled...

  6. Effect of large volume paracentesis on plasma volume--a cause of hypovolemia

    International Nuclear Information System (INIS)

    Kao, H.W.; Rakov, N.E.; Savage, E.; Reynolds, T.B.

    1985-01-01

    Large volume paracentesis, while effectively relieving symptoms in patients with tense ascites, has been generally avoided due to reports of complications attributed to an acute reduction in intravascular volume. Measurements of plasma volume in these subjects have been by indirect methods and have not uniformly confirmed hypovolemia. We have prospectively evaluated 18 patients (20 paracenteses) with tense ascites and peripheral edema due to chronic liver disease undergoing 5 liter paracentesis for relief of symptoms. Plasma volume pre- and postparacentesis was assessed by a ^125I-labeled human serum albumin dilution technique as well as by the change in hematocrit and postural blood pressure difference. No significant change in serum sodium, urea nitrogen, hematocrit or postural systolic blood pressure difference was noted at 24 or 48 hr after paracentesis. Serum creatinine at 24 hr after paracentesis was unchanged, but a small but statistically significant increase in serum creatinine was noted at 48 hr postparacentesis. Plasma volume changed -2.7% (n = 6, not statistically significant) during the first 24 hr and -2.8% (n = 12, not statistically significant) during the 0- to 48-hr period. No complications from paracentesis were noted. These results suggest that 5 liter paracentesis for relief of symptoms is safe in patients with tense ascites and peripheral edema from chronic liver disease.

  7. Statistical trajectory of an approximate EM algorithm for probabilistic image processing

    International Nuclear Information System (INIS)

    Tanaka, Kazuyuki; Titterington, D M

    2007-01-01

    We calculate analytically a statistical average of trajectories of an approximate expectation-maximization (EM) algorithm with generalized belief propagation (GBP) and a Gaussian graphical model for the estimation of hyperparameters from observable data in probabilistic image processing. A statistical average with respect to observed data corresponds to a configuration average for the random-field Ising model in spin glass theory. In the present paper, hyperparameters which correspond to interactions and external fields of spin systems are estimated by an approximate EM algorithm. A practical algorithm is described for gray-level image restoration based on a Gaussian graphical model and GBP. The GBP approach corresponds to the cluster variation method in statistical mechanics. Our main result in the present paper is to obtain the statistical average of the trajectory in the approximate EM algorithm by using loopy belief propagation and GBP with respect to degraded images generated from a probability density function with true values of hyperparameters. The statistical average of the trajectory can be expressed in terms of recursion formulas derived from some analytical calculations

  8. Ultrasmall volume molecular isothermal amplification in microfluidic chip with advanced surface processing

    International Nuclear Information System (INIS)

    Huang Guoliang; Yang Xiaoyong; Ma Li; Yang Xu

    2011-01-01

    In this paper, we developed a metal microfluidic chip with advanced surface processing for ultra-small volume molecular isothermal amplification. The method takes advantage of nucleic acid amplification with good stability and consistency, high sensitivity (about 31 genomic DNA copies) and bacteria-specific gene identification. Owing to the advanced surface processing, the bioreaction assay volume for nucleic acid amplification was reduced to about 392 nl. A high numerical aperture confocal optical detection system was developed to sensitively monitor the DNA amplification with low noise and high fluorescence collection efficiency, near the optical diffraction limit. A rapid nucleic acid isothermal amplification was performed in the ultra-small volume microfluidic chip, where the time at the inflexion points of the second derivative of the exponential DNA amplification curves was brought forward and the sensitivity was improved about 65-fold compared with the current 25 μl Ep-tube amplification reaction, which indicates promise for clinical molecular diagnostics.

  9. Simulation of the radiography formation process from CT patient volume

    International Nuclear Information System (INIS)

    Bifulco, P.; Cesarelli, M.; Verso, E.; Roccasalva Firenze, M.; Sansone, M.; Bracale, M.

    1998-01-01

    The aim of this work is to develop an algorithm to simulate the radiographic image formation process using volumetric anatomical data of the patient, obtained from 3D diagnostic CT images. Many applications, including radiographically driven surgery, virtual reality in medicine, and radiologist teaching and training, may take advantage of such a technique. The designed algorithm simulates a generic radiographic equipment, arbitrarily oriented with respect to the patient. The simulated radiography is obtained considering a discrete number of X-ray paths departing from the focus, passing through the patient volume and reaching the radiographic plane. To evaluate a generic pixel of the simulated radiography, the cumulative absorption along the corresponding X-ray is computed. To estimate X-ray absorption at a generic point of the patient volume, 3D interpolation of CT data has been adopted. The proposed technique is quite similar to those employed in ray tracing. A computer-designed test volume has been used to assess the reliability of the radiography simulation algorithm as a measuring tool. The error analysis shows that the accuracy achieved by the radiographic simulation algorithm is largely confined within the sampling step of the CT volume. (authors)
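
    A compact sketch of the described ray-casting idea, under illustrative assumptions (geometry, sampling density and a synthetic test volume; this is not the authors' implementation): for each detector pixel, CT values are sampled along the source-to-pixel ray with trilinear interpolation and summed into a line integral.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def simulate_radiograph(volume, source, detector_origin, du, dv, shape, n_samples=256):
            """Accumulate interpolated voxel values along source->pixel rays."""
            image = np.zeros(shape)
            ts = np.linspace(0.0, 1.0, n_samples)
            for i in range(shape[0]):
                for j in range(shape[1]):
                    pixel = detector_origin + i * du + j * dv
                    # Points sampled along the ray, shape (3, n_samples)
                    pts = source[:, None] + ts[None, :] * (pixel - source)[:, None]
                    vals = map_coordinates(volume, pts, order=1, cval=0.0)  # trilinear
                    step = np.linalg.norm(pixel - source) / n_samples
                    image[i, j] = vals.sum() * step  # cumulative absorption (line integral)
            return image

        # Synthetic test volume: a dense cube inside an empty volume
        vol = np.zeros((64, 64, 64))
        vol[24:40, 24:40, 24:40] = 1.0

        img = simulate_radiograph(
            vol,
            source=np.array([32.0, 32.0, -100.0]),
            detector_origin=np.array([0.0, 0.0, 128.0]),
            du=np.array([1.0, 0.0, 0.0]),
            dv=np.array([0.0, 1.0, 0.0]),
            shape=(64, 64),
        )
        print(img.max())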

  10. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) on June 11–16, 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  11. DWARF GALAXY STARBURST STATISTICS IN THE LOCAL VOLUME

    International Nuclear Information System (INIS)

    Lee, Janice C.; Kennicutt, Robert C.; Akiyama, Sanae; Funes, S. J. Jose G.; Sakai, Shoko

    2009-01-01

    An unresolved question in galaxy evolution is whether the star formation histories (SFHs) of low-mass systems are preferentially dominated by starbursts or modes that are more quiescent and continuous. Here, we quantify the prevalence of global starbursts in dwarf galaxies at the present epoch and infer their characteristic durations and amplitudes. The analysis is based on the Hα component of the 11 Mpc Hα UV Galaxy Survey (11HUGS), which provides Hα and Galaxy Evolution Explorer UV imaging for an approximately volume-limited sample of ∼ 300 star-forming galaxies within 11 Mpc. We first examine the completeness properties of the sample, and then directly tally the number of bursting dwarfs and compute the fraction of star formation that is concentrated in such systems. To identify starbursting dwarfs, we use an integrated Hα equivalent width (EW) threshold of 100 Å, which corresponds to a stellar birthrate of ∼ 2.5, and also explore the use of empirical starburst definitions based on σ thresholds of the observed logarithmic EW distributions. Our results are robust to the exact choice of the threshold, and are consistent with a picture where dwarfs that are currently experiencing massive global bursts are just the ∼ 6% tip of a low-mass galaxy iceberg. Moreover, bursts are only responsible for about a quarter of the total star formation in the overall dwarf population, so the majority of stars in low-mass systems are not formed in this mode today. Spirals and irregulars devoid of Hα emission are rare, indicating that the complete cessation of star formation generally does not occur in such galaxies and is not characteristic of the interburst state, at least for the more luminous systems with M_B < -15. The starburst statistics presented here directly constrain the duty cycle and the average burst amplitude under the simplest assumptions where all dwarf irregulars share a common SFH and undergo similar burst cycles with equal probability. Uncertainties

  12. Hadronic electroweak processes in a finite volume

    Energy Technology Data Exchange (ETDEWEB)

    Agadjanov, Andria

    2017-11-07

    In the present thesis, we study a number of hadronic electroweak processes in a finite volume. Our work is motivated by the ongoing and future lattice simulations of the strong interaction theory called quantum chromodynamics. According to the available computational resources, the numerical calculations are necessarily performed on lattices with a finite spatial extension. The first part of the thesis is based on the finite volume formalism which is a standard method to investigate the processes with the final state interactions, and in particular, the elastic hadron resonances, on the lattice. Throughout the work, we systematically apply the non-relativistic effective field theory. The great merit of this approach is that it encodes the low-energy dynamics directly in terms of the effective range expansion parameters. After a brief introduction into the subject, we formulate a framework for the extraction of the ΔNγ* as well as the B→K* transition form factors from lattice data. Both processes are of substantial phenomenological interest, including the search for physics beyond the Standard Model. Moreover, we provide a proper field-theoretical definition of the resonance matrix elements, and advocate it in comparison to the one based on the infinitely narrow width approximation. In the second part we consider certain aspects of the doubly virtual nucleon Compton scattering. The main objective of the work is to answer the question whether there is, in the Regge language, a so-called fixed pole in the process. To answer this question, the unknown subtraction function, which enters one of the dispersion relations for the invariant amplitudes, has to be determined. The external field method provides a feasible approach to tackle this problem on the lattice. Considering the nucleon in a periodic magnetic field, we derive a simple relation for the ground state energy shift up to a second order in the field strength. The obtained result encodes the

  13. Hadronic electroweak processes in a finite volume

    International Nuclear Information System (INIS)

    Agadjanov, Andria

    2017-01-01

    In the present thesis, we study a number of hadronic electroweak processes in a finite volume. Our work is motivated by the ongoing and future lattice simulations of the strong interaction theory called quantum chromodynamics. According to the available computational resources, the numerical calculations are necessarily performed on lattices with a finite spatial extension. The first part of the thesis is based on the finite volume formalism which is a standard method to investigate the processes with the final state interactions, and in particular, the elastic hadron resonances, on the lattice. Throughout the work, we systematically apply the non-relativistic effective field theory. The great merit of this approach is that it encodes the low-energy dynamics directly in terms of the effective range expansion parameters. After a brief introduction into the subject, we formulate a framework for the extraction of the ΔNγ* as well as the B→K* transition form factors from lattice data. Both processes are of substantial phenomenological interest, including the search for physics beyond the Standard Model. Moreover, we provide a proper field-theoretical definition of the resonance matrix elements, and advocate it in comparison to the one based on the infinitely narrow width approximation. In the second part we consider certain aspects of the doubly virtual nucleon Compton scattering. The main objective of the work is to answer the question whether there is, in the Regge language, a so-called fixed pole in the process. To answer this question, the unknown subtraction function, which enters one of the dispersion relations for the invariant amplitudes, has to be determined. The external field method provides a feasible approach to tackle this problem on the lattice. Considering the nucleon in a periodic magnetic field, we derive a simple relation for the ground state energy shift up to a second order in the field strength. The obtained result encodes the value of the

  14. Use of statistical process control as part of a quality assurance plan

    International Nuclear Information System (INIS)

    Acosta, S.; Lewis, C.

    2013-01-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of the results through the control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory for environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The present work concerns the control of blanks, for which a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to account for significant process variables, such as temperature and humidity, and to build a blank control chart, which forms the basis of statistical process control. The data obtained yielded the lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan.
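
    A minimal sketch of the kind of blank control chart described, using the standard individuals-chart convention (centre line ± 3σ, with σ estimated from the average moving range). The measurement values and limits here are invented for illustration and are not the laboratory's data.

        import numpy as np

        def control_limits(reference):
            centre = reference.mean()
            # Sigma estimated from the average moving range (standard individuals chart)
            mr = np.abs(np.diff(reference)).mean()
            sigma = mr / 1.128          # d2 constant for subgroups of size 2
            return centre, centre - 3 * sigma, centre + 3 * sigma

        rng = np.random.default_rng(2)
        blank_reference = rng.normal(0.05, 0.01, size=50)   # e.g. blank count rates
        centre, lcl, ucl = control_limits(blank_reference)

        new_blank = 0.09
        in_control = lcl <= new_blank <= ucl
        print(f"centre={centre:.3f}, LCL={lcl:.3f}, UCL={ucl:.3f}, in control: {in_control}")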

  15. Statistical test data selection for reliability evaluation of process computer software

    International Nuclear Information System (INIS)

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.) [de]
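
    A small sketch of the stratified-sampling idea under invented strata: process states are grouped by the expected frequency of demands, and test vectors are drawn per stratum with proportional allocation. The strata, probabilities and state-vector bounds are illustrative assumptions, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical strata of process states with demand probabilities
        strata = {
            "normal_operation": {"prob": 0.90, "low": (0.0, 0.0), "high": (0.5, 0.5)},
            "disturbance":      {"prob": 0.09, "low": (0.5, 0.5), "high": (0.9, 0.9)},
            "accident":         {"prob": 0.01, "low": (0.9, 0.9), "high": (1.0, 1.0)},
        }

        def draw_test_cases(strata, n_total):
            cases = []
            for name, s in strata.items():
                n = round(s["prob"] * n_total)   # proportional allocation (may round)
                low, high = np.array(s["low"]), np.array(s["high"])
                vectors = rng.uniform(low, high, size=(n, len(low)))
                cases.extend((name, v) for v in vectors)
            return cases

        for name, v in draw_test_cases(strata, 20):
            print(name, np.round(v, 3))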

  16. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools.

    Science.gov (United States)

    Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

    2012-05-01

    In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, ¹⁵O-H₂O and ¹⁸F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
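
    The convolution structure described above can be sketched as a discrete deconvolution: the tissue curve is modelled as the arterial input convolved with a non-negative residue, recovered here by non-negative least squares. This is a simplified stand-in for the piecewise-linear, quadratic-programming estimator of the review; the curves and time grid are synthetic.

        import numpy as np
        from scipy.optimize import nnls

        def estimate_residue(arterial, tissue, dt):
            n = len(arterial)
            # Lower-triangular convolution matrix: tissue = dt * A @ residue
            A = dt * np.array(
                [[arterial[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)]
            )
            residue, _ = nnls(A, tissue)
            return residue

        dt = 1.0
        t = np.arange(0, 60, dt)
        arterial = t * np.exp(-t / 4.0)                  # synthetic input function
        true_residue = np.exp(-t / 20.0)                 # survival-like residue
        tissue = dt * np.convolve(arterial, true_residue)[: len(t)]

        est = estimate_residue(arterial, tissue, dt)
        print("residue at t=0 (flow-related summary):", est[0])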

  17. Development of the NRC's Human Performance Investigation Process (HPIP). Volume 2, Investigator's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States)

    1993-10-01

    The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance related events at nuclear power plants. The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provides a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume II, is a field manual for use by investigators when performing event investigations. Volume II includes the HPIP Procedure, the HPIP Modules, and Appendices that provide extensive documentation of each investigation technique.

  18. Load research manual. Volume 1. Load research procedures

    Energy Technology Data Exchange (ETDEWEB)

    Brandenburg, L.; Clarkson, G.; Grund, Jr., C.; Leo, J.; Asbury, J.; Brandon-Brown, F.; Derderian, H.; Mueller, R.; Swaroop, R.

    1980-11-01

    This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.

  19. Gasoline from coal in the state of Illinois: feasibility study. Volume I. Design. [KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    Volume 1 describes the proposed plant: the KBW gasification process, the ICI low-pressure methanol process and the Mobil M-gasoline process, together with ancillary processes such as the oxygen plant, shift process, RECTISOL purification process, sulfur recovery equipment and pollution control equipment. Numerous engineering diagrams are included. (LTN)

  20. Comparing statistical and process-based flow duration curve models in ungauged basins and changing rain regimes

    Science.gov (United States)

    Müller, M. F.; Thompson, S. E.

    2016-02-01

    The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
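
    For reference, the object both models predict, the empirical flow duration curve, can be computed in a few lines; the synthetic lognormal flows and the Weibull plotting position below are illustrative choices, not the study's data or method.

        import numpy as np

        def flow_duration_curve(flows):
            q = np.sort(flows)[::-1]                              # flows in descending order
            exceedance = np.arange(1, len(q) + 1) / (len(q) + 1)  # Weibull plotting position
            return exceedance, q

        rng = np.random.default_rng(4)
        daily_flows = rng.lognormal(mean=1.0, sigma=0.8, size=3650)  # 10 years, synthetic

        p, q = flow_duration_curve(daily_flows)
        q95 = np.interp(0.95, p, q)   # low-flow quantile often used for design
        print(f"Q95 = {q95:.2f}")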

  1. A concept of volume rendering guided search process to analyze medical data set.

    Science.gov (United States)

    Zhou, Jianlong; Xiao, Chun; Wang, Zhiyan; Takatsuka, Masahiro

    2008-03-01

    This paper first presents a parallel-coordinates-based parameter control panel (PCP). The PCP is used to control parameters of focal region-based volume rendering (FRVR) during data analysis. It uses a parallel-coordinates-style interface: different rendering parameters are represented as nodes on each axis, and renditions based on related parameters are connected using polylines to show dependencies between renditions and parameters. Based on the PCP, a concept of a volume rendering guided search process is proposed. The search pipeline is divided into four phases. Different parameters of FRVR are recorded and modulated in the PCP during the search phases. The concept shows that volume visualization can play the role of guiding a search process in the rendition space, helping users efficiently find local structures of interest. The usability of the proposed approach is evaluated to show its effectiveness.

  2. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  3. Federal Funds for Research and Development: Fiscal Years 1980, 1981, and 1982. Volume XXX. Detailed Statistical Tables. Surveys of Science Resources Series.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    During the March through July 1981 period a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived…

  4. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
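
    The core computation is easy to sketch: find the times at which volume exceeds a threshold q (here the 95th percentile) and difference them to get the recurrence intervals, whose empirical tail can then be inspected for power-law behaviour. The heavy-tailed synthetic series below stands in for real trading-volume data.

        import numpy as np

        rng = np.random.default_rng(5)
        volumes = rng.pareto(3.0, size=10000) + 1.0          # heavy-tailed proxy for volumes

        q = np.quantile(volumes, 0.95)
        event_times = np.flatnonzero(volumes > q)
        intervals = np.diff(event_times)                     # recurrence intervals tau

        # Empirical tail P(tau > x); a power law is a straight line on log-log axes
        xs = np.unique(intervals)
        tail = [(intervals > x).mean() for x in xs]
        print(list(zip(xs[:5], np.round(tail[:5], 3))))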

  5. Reducing lumber thickness variation using real-time statistical process control

    Science.gov (United States)

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  6. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Science.gov (United States)

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed, which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
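
    The Pareto step can be sketched generically: count how often each monitored QC metric exceeds its empirical threshold and rank the metrics by that count, so attention goes to the dominant source of variation. The metric names and counts below are invented for illustration and are not SProCoP output.

        # Hypothetical exceedance counts per QC metric over a batch of runs
        exceedances = {
            "retention_time": 14,
            "peak_asymmetry": 6,
            "resolution": 3,
            "ion_intensity": 22,
            "mass_accuracy": 2,
        }

        ranked = sorted(exceedances.items(), key=lambda kv: kv[1], reverse=True)
        total = sum(exceedances.values())
        cumulative = 0
        for name, count in ranked:
            cumulative += count
            print(f"{name:16s} {count:3d}  cumulative {100 * cumulative / total:5.1f}%")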

  7. Estimation of Apple Volume and Its Shape Indentation Using Image Processing Technique and Neural Network

    Directory of Open Access Journals (Sweden)

    M Jafarlou

    2014-04-01

    Physical properties of agricultural products, such as volume, are the most important parameters influencing grading and packaging systems. They should be measured accurately, as they are considered in any good system design. Image processing and neural network techniques are both non-destructive and useful methods which have recently been used for this purpose. In this study, images of apples were captured from a constant distance and then processed in MATLAB software, and the edges of the apple images were extracted. The interior area of the apple image was divided into thin trapezoidal elements perpendicular to the longitudinal axis. The total volume of the apple was estimated by summing the incremental volumes of these elements revolved around the apple's longitudinal axis. A picture of a half-cut apple was also captured in order to obtain the volume of the apple shape's indentation, which was subtracted from the previously estimated total volume. The real volume of the apples was measured using the water displacement method, and the relation between the real volume and the estimated volume was obtained. The t-test and Bland-Altman analysis indicated that the difference between the real volume and the estimated volume was not statistically significant (p>0.05); the mean difference was 1.52 cm³ and the accuracy of measurement was 92%. Utilizing a neural network with dimension and mass as input variables increased the accuracy to 97% and decreased the mean difference between the volumes to 0.7 cm³.
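
    The volume estimate itself amounts to a solid of revolution built from the edge profile: thin elements perpendicular to the longitudinal axis contribute π r² dx each. Below is a minimal sketch with a synthetic radius profile; in the paper the profile comes from the extracted image edges and the indentation term from the half-cut image.

        import numpy as np

        def volume_of_revolution(radii, dx):
            # Trapezoidal elements revolved about the axis: V = sum of pi * r^2 * dx
            r = np.asarray(radii, dtype=float)
            return np.pi * dx * 0.5 * ((r[:-1] ** 2) + (r[1:] ** 2)).sum()

        x = np.linspace(0, 7.0, 200)                     # cm along the apple axis
        radii = 3.0 * np.sin(np.pi * x / 7.0) ** 0.5     # synthetic apple-like profile
        total = volume_of_revolution(radii, dx=x[1] - x[0])

        indentation = 4.0                                # cm^3, from the half-cut image
        print(f"estimated volume = {total - indentation:.1f} cm^3")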

  8. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Science.gov (United States)

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that

  9. Effect of hospital volume on processes of breast cancer care: A National Cancer Data Base study.

    Science.gov (United States)

    Yen, Tina W F; Pezzin, Liliana E; Li, Jianing; Sparapani, Rodney; Laud, Purushuttom W; Nattinger, Ann B

    2017-05-15

    The purpose of this study was to examine variations in delivery of several breast cancer processes of care that are correlated with lower mortality and disease recurrence, and to determine the extent to which hospital volume explains this variation. Women who were diagnosed with stage I-III unilateral breast cancer between 2007 and 2011 were identified within the National Cancer Data Base. Multiple logistic regression models were developed to determine whether hospital volume was independently associated with each of 10 individual process of care measures addressing diagnosis and treatment, and 2 composite measures assessing appropriateness of systemic treatment (chemotherapy and hormonal therapy) and locoregional treatment (margin status and radiation therapy). Among 573,571 women treated at 1755 different hospitals, 38%, 51%, and 10% were treated at high-, medium-, and low-volume hospitals, respectively. On multivariate analysis controlling for patient sociodemographic characteristics, treatment year and geographic location, hospital volume was a significant predictor for cancer diagnosis by initial biopsy (medium volume: odds ratio [OR] = 1.15, 95% confidence interval [CI] = 1.05-1.25; high volume: OR = 1.30, 95% CI = 1.14-1.49), negative surgical margins (medium volume: OR = 1.15, 95% CI = 1.06-1.24; high volume: OR = 1.28, 95% CI = 1.13-1.44), and appropriate locoregional treatment (medium volume: OR = 1.12, 95% CI = 1.07-1.17; high volume: OR = 1.16, 95% CI = 1.09-1.24). Diagnosis of breast cancer before initial surgery, negative surgical margins and appropriate use of radiation therapy may partially explain the volume-survival relationship. Dissemination of these processes of care to a broader group of hospitals could potentially improve the overall quality of care and outcomes of breast cancer survivors. Cancer 2017;123:957-66. © 2016 American Cancer Society.
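
    The modelling step can be sketched generically as a covariate-adjusted logistic regression reporting odds ratios with 95% CIs for hospital volume. The data frame, variable names and covariates below are invented for illustration; the study adjusted for sociodemographic characteristics, treatment year and geographic location.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        n = 2000
        df = pd.DataFrame({
            "negative_margins": rng.binomial(1, 0.85, size=n),
            "volume": rng.choice(["low", "medium", "high"], size=n, p=[0.10, 0.51, 0.39]),
            "age": rng.normal(60, 10, size=n),
        })

        # Logistic regression with low-volume hospitals as the reference category
        model = smf.logit(
            "negative_margins ~ C(volume, Treatment('low')) + age", data=df
        ).fit(disp=0)

        odds_ratios = np.exp(model.params)
        ci = np.exp(model.conf_int())
        print(pd.concat(
            [odds_ratios.rename("OR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1
        ))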

  10. Statistical representation of a spray as a point process

    International Nuclear Information System (INIS)

    Subramaniam, S.

    2000-01-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed. (c) 2000 American Institute of Physics

  11. Effect of drop volume and surface statistics on the superhydrophobicity of randomly rough substrates

    Science.gov (United States)

    Afferrante, L.; Carbone, G.

    2018-01-01

    In this paper, a simple theoretical approach is developed with the aim of evaluating the shape, interfacial pressure, apparent contact angle and contact area of liquid drops gently deposited on randomly rough surfaces. This method can be useful to characterize the superhydrophobic properties of rough substrates, and to investigate the contact behavior of impacting drops. We assume that (i) the size of the apparent liquid-solid contact area is much larger than the micromorphology of the substrate, and (ii) a composite interface is always formed at the microscale. Results show that the apparent contact angle and the liquid-solid area fraction are slightly influenced by the drop volume only at relatively high values of the root mean square roughness h_rms, whereas the effect of volume is practically negligible at small h_rms. The main statistical quantity affecting the superhydrophobic properties is found to be the Wenzel roughness parameter r_W, which depends on the average slope of the surface heights. Moreover, transition from the Cassie-Baxter state to the Wenzel one is observed when r_W reduces below a certain critical value, and theoretical predictions are found to be in good agreement with experimental data. Finally, the present method can be conveniently exploited to evaluate the occurrence of pinning phenomena in the case of impacting drops, as the Wenzel critical pressure for liquid penetration gives an estimation of the maximum impact pressure tolerated by the surface without pinning occurring.
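
    The key statistic is simple to compute on a gridded surface: r_W is the mean ratio of true to projected area element, which depends only on the local slopes. A minimal sketch with a synthetic correlated roughness field follows; the grid size, smoothing kernel and h_rms value are arbitrary assumptions.

        import numpy as np
        from scipy.signal import fftconvolve

        def wenzel_parameter(h, dx):
            hx, hy = np.gradient(h, dx)
            # Area element of the rough surface relative to its projection
            return np.mean(np.sqrt(1.0 + hx ** 2 + hy ** 2))

        rng = np.random.default_rng(7)
        h_rms = 0.2

        # Smooth a white-noise field to mimic a correlated random roughness
        noise = rng.normal(size=(256, 256))
        kernel = np.outer(np.hanning(15), np.hanning(15))
        h = fftconvolve(noise, kernel / kernel.sum(), mode="same")
        h *= h_rms / h.std()                     # rescale to the target rms roughness

        print(f"r_W = {wenzel_parameter(h, dx=0.05):.3f}")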

  12. Simulation of the radiography formation process from CT patient volume

    Energy Technology Data Exchange (ETDEWEB)

    Bifulco, P; Cesarelli, M; Verso, E; Roccasalva Firenze, M; Sansone, M; Bracale, M [University of Naples, Federico II, Electronic Engineering Department, Bioengineering Unit, Via Claudio, 21 - 80125 Naples (Italy)

    1999-12-31

    The aim of this work is to develop an algorithm to simulate the radiographic image formation process using volumetric anatomical data of the patient, obtained from 3D diagnostic CT images. Many applications, including radiographically driven surgery, virtual reality in medicine, and radiologist teaching and training, may take advantage of such a technique. The designed algorithm simulates a generic radiographic equipment, arbitrarily oriented with respect to the patient. The simulated radiography is obtained considering a discrete number of X-ray paths departing from the focus, passing through the patient volume and reaching the radiographic plane. To evaluate a generic pixel of the simulated radiography, the cumulative absorption along the corresponding X-ray is computed. To estimate X-ray absorption at a generic point of the patient volume, 3D interpolation of CT data has been adopted. The proposed technique is quite similar to those employed in ray tracing. A computer-designed test volume has been used to assess the reliability of the radiography simulation algorithm as a measuring tool. The error analysis shows that the accuracy achieved by the radiographic simulation algorithm is largely confined within the sampling step of the CT volume. (authors) 16 refs., 12 figs., 1 tab.

  13. Petroleum supply annual 1996: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1996 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Capacity; each with final annual data. The second volume contains final statistics for each month of 1996, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.

  14. Petroleum supply annual 1994, Volume 2

    International Nuclear Information System (INIS)

    1995-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1994 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity each with final annual data. The second volume contains final statistics for each month of 1994, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary

  15. Petroleum supply annual 1996: Volume 2

    International Nuclear Information System (INIS)

    1997-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1996 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Capacity; each with final annual data. The second volume contains final statistics for each month of 1996, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs

  16. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Directory of Open Access Journals (Sweden)

    Stephan Birle

    2016-01-01

    In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define the control performance as still acceptable may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that adjusts the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the
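
    The feedback element can be sketched as a bare-bones two-input fuzzy controller mapping control error and change in error to a temperature step, with triangular membership functions and weighted-average defuzzification. The sets and rule base below are placeholders; in the paper they are tuned by a genetic algorithm against the statistical process corridors.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return np.maximum(
                np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0
            )

        SETS = {"neg": (-2.0, -1.0, 0.0), "zero": (-1.0, 0.0, 1.0), "pos": (0.0, 1.0, 2.0)}
        # Rule base: (error set, d_error set) -> output temperature step (deg C)
        RULES = {
            ("neg", "neg"): +1.0, ("neg", "zero"): +0.5, ("neg", "pos"): 0.0,
            ("zero", "neg"): +0.5, ("zero", "zero"): 0.0, ("zero", "pos"): -0.5,
            ("pos", "neg"): 0.0, ("pos", "zero"): -0.5, ("pos", "pos"): -1.0,
        }

        def fuzzy_step(error, d_error):
            num = den = 0.0
            for (e_set, de_set), out in RULES.items():
                w = min(tri(error, *SETS[e_set]), tri(d_error, *SETS[de_set]))  # AND = min
                num += w * out                    # weighted-average defuzzification
                den += w
            return num / den if den > 0 else 0.0

        print(fuzzy_step(error=-0.8, d_error=0.2))  # process below trajectory -> heat up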

  17. Competent statistical programmer: Need of business process outsourcing industry

    Science.gov (United States)

    Khan, Imran

    2014-01-01

    Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is viewed as a preferred destination for pharmaceutical outsourcing owing to cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require a very niche skill set for service delivery. Demand and supply are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes. PMID:24987578

  18. Competent statistical programmer: Need of business process outsourcing industry.

    Science.gov (United States)

    Khan, Imran

    2014-07-01

    Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is viewed as a preferred destination for pharmaceutical outsourcing owing to cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require a very niche skill set for service delivery. Demand and supply are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes.

  19. Competent statistical programmer: Need of business process outsourcing industry

    Directory of Open Access Journals (Sweden)

    Imran Khan

    2014-01-01

    Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is viewed as a preferred destination for pharmaceutical outsourcing owing to cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require a very niche skill set for service delivery. Demand and supply are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes.

  20. Incineration as a radioactive waste volume reduction process for CEA nuclear centers

    International Nuclear Information System (INIS)

    Atabek, R.; Chaudon, L.

    1994-01-01

    Incineration processes represent a promising solution for waste volume reduction, and will be increasingly used in the future. The features and performance specifications of low-level waste incinerators with capacities ranging from 10 to 20 kg·h⁻¹ at the Fontenay-aux-Roses, Grenoble and Cadarache nuclear centers in France are briefly reviewed. More extensive knowledge of low-level wastes produced in facilities operated by the Commissariat a l'Energie Atomique (CEA) has allowed us to assess the volume reduction obtained by processing combustible waste in existing incinerators. Research and development work is in progress to improve management procedures for higher-level waste and to build facilities capable of incinerating α-contaminated waste. (authors). 6 refs., 5 figs., 1 tab

  1. Comparison of Statistical Post-Processing Methods for Probabilistic Wind Speed Forecasting

    Science.gov (United States)

    Han, Keunhee; Choi, JunTae; Kim, Chansoo

    2018-02-01

    In this study, the statistical post-processing methods that include bias-corrected and probabilistic forecasts of wind speed measured in PyeongChang, which is scheduled to host the 2018 Winter Olympics, are compared and analyzed to provide more accurate weather information. The six post-processing methods used in this study are as follows: mean bias-corrected forecast, mean and variance bias-corrected forecast, decaying averaging forecast, mean absolute bias-corrected forecast, and the alternative implementations of ensemble model output statistics (EMOS) and Bayesian model averaging (BMA) models, namely the EMOS and BMA exchangeable models obtained by assuming exchangeable ensemble members and the simplified versions of the EMOS and BMA models. Observations of wind speed were obtained from the 26 stations in PyeongChang, and 51 ensemble member forecasts derived from the European Centre for Medium-Range Weather Forecasts (ECMWF Directorate, 2012) were obtained between 1 May 2013 and 18 March 2016. Prior to applying the post-processing methods, reliability analysis was conducted using rank histograms to identify the statistical consistency of the ensemble forecasts and the corresponding observations. Based on the results of our study, we found that the prediction skill of the probabilistic forecasts of the EMOS and BMA models was superior to that of the bias-corrected forecasts in terms of deterministic prediction, whereas in probabilistic prediction, the BMA models showed better prediction skill than EMOS. Even though the simplified version of the BMA model exhibited the best prediction skill among the six methods, the results showed that the differences in prediction skill between the versions of EMOS and BMA were negligible.
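
    The decaying-averaging correction, one of the simpler methods in the comparison, can be sketched in a few lines: a running bias estimate is updated with weight w after each forecast-observation pair and subtracted from the next forecast. The weight w = 0.05 and the synthetic data are assumptions for illustration.

        import numpy as np

        def decaying_average_correction(forecasts, observations, w=0.05):
            bias = 0.0
            corrected = []
            for f, o in zip(forecasts, observations):
                corrected.append(f - bias)           # correct with the current estimate
                bias = (1 - w) * bias + w * (f - o)  # then update from the new pair
            return np.array(corrected)

        rng = np.random.default_rng(8)
        obs = rng.gamma(2.0, 2.0, size=300)               # wind speed (m/s), synthetic
        fcst = obs + 1.5 + rng.normal(0, 1.0, size=300)   # ensemble mean with +1.5 bias

        corr = decaying_average_correction(fcst, obs)
        print("raw bias:", np.mean(fcst - obs).round(2),
              " corrected bias:", np.mean(corr - obs).round(2))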

  2. Application of Statistical Process Control (SPC) in Quality Control

    Directory of Open Access Journals (Sweden)

    Carlos Hernández-Pedrera

    2015-12-01

    The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. The specific objectives were to identify the variables to be analyzed for statistical process control (SPC), to analyze possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with SPC before and after direct monitoring on the production line. Sampling and laboratory replacement methods were used to determine the quality of the finished product; statistical methods were then applied, seeking to emphasize the importance and contribution of their application to monitoring corrective actions and supporting production processes. It was shown that the process is under control, because the results fell within the established control limits. There is, however, a tendency for the distribution to be displaced toward one end of the boundary and to exceed the limits, creating the possibility that under certain conditions the process goes out of control; the results also showed that the process, while within the quality control limits, is operating far from optimal conditions. In none of the study situations were products obtained outside the weight and discoloration limits, but defective products were nevertheless obtained.

  3. Statistical error in simulations of Poisson processes: Example of diffusion in solids

    Science.gov (United States)

    Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.

    2016-08-01

    Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any computational method in particular, but is valid in the context of simulation of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
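
    The generic Poisson counting result behind such error expressions is easy to verify numerically: with an expected N events, the relative statistical error of the estimated rate scales as 1/√N. The sketch below checks this with synthetic runs; it is not the paper's kinetic Monte Carlo setup.

        import numpy as np

        rng = np.random.default_rng(9)
        rate, t = 0.4, 1000.0              # events per unit time, simulated time span
        n_runs = 2000

        counts = rng.poisson(rate * t, size=n_runs)
        rates = counts / t

        predicted = 1.0 / np.sqrt(rate * t)        # analytical relative error, 1/sqrt(N)
        measured = rates.std() / rates.mean()
        print(f"predicted {predicted:.4f} vs measured {measured:.4f}")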

  4. Los Alamos Controlled Air Incinerator for radioactive waste. Volume I. Rationale, process, equipment, performance, and recommendations

    International Nuclear Information System (INIS)

    Neuls, A.S.; Draper, W.E.; Koenig, R.A.; Newmyer, J.M.; Warner, C.L.

    1982-08-01

    This two-volume report is a detailed design and operating documentation of the Los Alamos National Laboratory Controlled Air Incinerator (CAI) and is an aid to technology transfer to other Department of Energy contractor sites and the commercial sector. Volume I describes the CAI process, equipment, and performance, and it recommends modifications based on Los Alamos experience. It provides the necessary information for conceptual design and feasibility studies. Volume II provides descriptive engineering information such as drawings, specifications, calculations, and costs. It aids duplication of the process at other facilities

  5. Los Alamos Controlled Air Incinerator for radioactive waste. Volume I. Rationale, process, equipment, performance, and recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Neuls, A.S.; Draper, W.E.; Koenig, R.A.; Newmyer, J.M.; Warner, C.L.

    1982-08-01

    This two-volume report is a detailed design and operating documentation of the Los Alamos National Laboratory Controlled Air Incinerator (CAI) and is an aid to technology transfer to other Department of Energy contractor sites and the commercial sector. Volume I describes the CAI process, equipment, and performance, and it recommends modifications based on Los Alamos experience. It provides the necessary information for conceptual design and feasibility studies. Volume II provides descriptive engineering information such as drawings, specifications, calculations, and costs. It aids duplication of the process at other facilities.

  6. Graphene growth process modeling: a physical-statistical approach

    Science.gov (United States)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, together with methods for characterizing and controlling the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
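
    The growth-velocity model lends itself to a short fitting sketch: a confined exponential of the form v(t) = v_max(1 - exp(-kt)), fitted by nonlinear least squares to synthetic noisy observations. The exact parameterization in the paper may differ; the values, units and noise level here are arbitrary.

        import numpy as np
        from scipy.optimize import curve_fit

        def confined_exponential(t, v_max, k):
            return v_max * (1.0 - np.exp(-k * t))

        rng = np.random.default_rng(10)
        t = np.linspace(0, 30, 40)                       # minutes of growth
        velocity = confined_exponential(t, 2.0, 0.2) + rng.normal(0, 0.05, t.size)

        popt, pcov = curve_fit(confined_exponential, t, velocity, p0=[1.0, 0.1])
        print(f"v_max = {popt[0]:.2f} um^2/min, k = {popt[1]:.3f} 1/min")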

  7. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  8. Challenges in computational statistics and data mining

    CERN Document Server

    Mielniczuk, Jan

    2016-01-01

    This volume contains nineteen research papers belonging to the areas of computational statistics, data mining, and their applications. Those papers, all written specifically for this volume, are their authors' contributions to honour and celebrate Professor Jacek Koronacki on the occasion of his 70th birthday. The book's related and often interconnected topics represent Jacek Koronacki's research interests and their evolution. They also clearly indicate how close the areas of computational statistics and data mining are.

  9. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    Science.gov (United States)

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe statistical process control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and subsequent data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met the inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the central line suffered inflexions. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  10. High-resolution marine flood modelling coupling overflow and overtopping processes: framing the hazard based on historical and statistical approaches

    Science.gov (United States)

    Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo

    2018-01-01

    A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution, with explicit buildings, urban structures such as sea-front walls, and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of the available hard and soft data and on conversion of qualitative to quantitative information, to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore condition scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagating inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential evolution of the hazard. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volume along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in the flooding, such as +13.5 % in water volumes propagating inland or +11.3 % in affected surfaces. In some areas, the flood response appears to be very sensitive to the chosen scenario, with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness of, or uncertainty in, the results. Considering a 100-year scenario with mean sea level rise (0.6 m), hazard

  11. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    Science.gov (United States)

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  12. Energy statistics: Fourth quarter, 1989

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    This volume contains 100 tables compiling data into the following broad categories: energy, drilling, natural gas, gas liquids, oil, coal, peat, electricity, uranium, and business indicators. The types of data that are given include production and consumption statistics, reserves, imports and exports, prices, fossil fuel and nuclear power generation statistics, and price indices

  13. Identification of wastewater treatment processes for nutrient removal on a full-scale WWTP by statistical methods

    DEFF Research Database (Denmark)

    Carstensen, Jakob; Madsen, Henrik; Poulsen, Niels Kjølstad

    1994-01-01

    The introduction of on-line sensors of nutrient salt concentrations on wastewater treatment plants opens a wide new area of modelling wastewater processes. Time series models of these processes are very useful for gaining insight into the real-time operation of wastewater treatment systems which deal … of the processes, i.e. including prior knowledge, with the significant effects found in data by using statistical identification methods. Rates of the biochemical and hydraulic processes are identified by statistical methods, and the related constants for the biochemical processes are estimated assuming Monod kinetics. The models include only those hydraulic and kinetic parameters which have been shown to be significant in a statistical sense, and hence they can be quantified. The application potential of these models is on-line control, because the present state of the plant is given by the variables of the models …
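    For reference, the Monod kinetics mentioned above express the process rate as a saturating function of substrate concentration; a minimal sketch with hypothetical constants:

```python
import numpy as np

def monod_rate(s, mu_max, k_s):
    """Monod kinetics: rate = mu_max * s / (k_s + s) for substrate concentration s."""
    return mu_max * s / (k_s + s)

substrate = np.linspace(0.0, 10.0, 6)              # substrate concentration (mg/L)
print(monod_rate(substrate, mu_max=4.0, k_s=1.5))  # hypothetical rate constants
```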

  14. Radiotherapy volume delineation using 18F-FDG-PET/CT modifies gross node volume in patients with oesophageal cancer.

    Science.gov (United States)

    Jimenez-Jimenez, E; Mateos, P; Aymar, N; Roncero, R; Ortiz, I; Gimenez, M; Pardo, J; Salinas, J; Sabater, S

    2018-05-02

    Evidence supporting the use of 18F-FDG-PET/CT in the segmentation process of oesophageal cancer for radiotherapy planning is limited. Our aim was to compare the volumes and tumour lengths defined by fused PET/CT vs. CT simulation. Twenty-nine patients were analyzed. All patients underwent a single PET/CT simulation scan. Two separate GTVs were defined: one based on CT data alone and another based on fused PET/CT data. Volume sizes for both data sets were compared and the spatial overlap was assessed by the Dice similarity coefficient (DSC). The gross tumour volume (GTVtumour) and maximum tumour diameter were greater by PET/CT, and length of primary tumour was greater by CT, but differences were not statistically significant. However, the gross node volume (GTVnode) was significantly greater by PET/CT. The DSC analysis showed excellent agreement for GTVtumour, 0.72, but was very low for GTVnode, 0.25. Our study shows that the volume definition by PET/CT and CT data differs. CT simulation, without taking into account PET/CT information, might leave cancer-involved nodes out of the radiotherapy-delineated volumes.
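    The Dice similarity coefficient used above is DSC = 2|A∩B| / (|A| + |B|); a minimal sketch on two hypothetical boolean segmentation masks:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|) for boolean volumes."""
    a, b = np.asarray(mask_a, dtype=bool), np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two hypothetical 3D masks standing in for GTVs delineated on CT vs. PET/CT
rng = np.random.default_rng(0)
gtv_ct = rng.random((32, 32, 16)) > 0.7
gtv_pet_ct = rng.random((32, 32, 16)) > 0.7
print(f"DSC = {dice_coefficient(gtv_ct, gtv_pet_ct):.2f}")
```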

  15. Development of the NRC`s Human Performance Investigation Process (HPIP). Volume 3, Development documentation

    Energy Technology Data Exchange (ETDEWEB)

    Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States)

    1993-10-01

    The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance related events at nuclear power plants. The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provides a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume III, is a detailed documentation of the development effort and the pilot training program.

  16. Data Mining and Statistics for Decision Making

    CERN Document Server

    Tufféry, Stéphane

    2011-01-01

    Data mining is the process of automatically searching large volumes of data for models and patterns using computational techniques from statistics, machine learning and information theory; it is the ideal tool for such an extraction of knowledge. Data mining is usually associated with a business or an organization's need to identify trends and profiles, allowing, for example, retailers to discover patterns on which to base marketing objectives. This book looks at both classical and recent techniques of data mining, such as clustering, discriminant analysis, logistic regression, generalized lin

  17. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed the harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
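    A minimal sketch of the harmonic Poisson process described above, sampling a Poisson process with intensity c/x on a finite interval via the standard time-change construction (the constant c and the interval are arbitrary choices here):

```python
import numpy as np

def harmonic_poisson_points(c, x_min, x_max, rng):
    """Poisson process on [x_min, x_max] with harmonic intensity c/x.

    Time change: Lambda(x) = c*log(x/x_min), so unit-rate arrivals E_k
    map to points x_min * exp(E_k / c)."""
    total = c * np.log(x_max / x_min)          # expected number of points
    arrivals = np.sort(rng.uniform(0.0, total, size=rng.poisson(total)))
    return x_min * np.exp(arrivals / c)

rng = np.random.default_rng(1)
pts = harmonic_poisson_points(c=5.0, x_min=1.0, x_max=100.0, rng=rng)
# Scale invariance: [1, 10] and [10, 100] both contain c*log(10) points on average
print(len(pts), int(np.sum(pts <= 10.0)), int(np.sum(pts > 10.0)))
```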

  18. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed the harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  19. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    Science.gov (United States)

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and the evaluation of rehabilitation is therefore complex. Time-series measurements are subject to process variation, which complicates the detection of true statistical change. The objective was to evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies whether and when change occurs in the process, according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting point in understanding the rehabilitation process using a real-time measurement approach.

  20. Phase transition for the system of finite volume in the ϕ4 theory in the Tsallis nonextensive statistics

    Science.gov (United States)

    Ishihara, Masamichi

    2018-04-01

    We studied the effects of nonextensivity on the phase transition for a system of finite volume V in the ϕ4 theory in the Tsallis nonextensive statistics of entropic parameter q and temperature T, when the deviation from the Boltzmann-Gibbs (BG) statistics, |q - 1|, is small. We calculated the condensate and the effective mass to order q - 1 with the normalized q-expectation value under the free-particle approximation with zero bare mass. The following facts were found. The condensate Φ divided by v, Φ/v, at q (v is the value of the condensate at T = 0) is smaller than that at q′ for q > q′, as a function of Tph/v, which is the physical temperature Tph divided by v. The physical temperature Tph is related to the variation of the Tsallis entropy and the variation of the internal energies, and Tph at q = 1 coincides with T. The effective mass decreases, reaches a minimum, and increases after that, as Tph increases. The effective mass at q > 1 is lighter than the effective mass at q = 1 at low physical temperature and heavier than the effective mass at q = 1 at high physical temperature. The effects of the nonextensivity on the physical quantities as functions of Tph become strong as |q - 1| increases. The results indicate the significance of the definition of the expectation value, the definition of the physical temperature, and the constraints for the density operator, when the terms including the volume of the system are not negligible.

  1. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    Science.gov (United States)

    Williams Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes, such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

  2. Penultimate modeling of spatial extremes: statistical inference for max-infinitely divisible processes

    KAUST Repository

    Huser, Raphaë l; Opitz, Thomas; Thibaud, Emeric

    2018-01-01

    Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability

  3. Estimação do volume de árvores utilizando redes neurais artificiais Estimate of tree volume using artificial neural nets

    Directory of Open Access Journals (Sweden)

    Eric Bastos Gorgens

    2009-12-01

    Full Text Available An artificial neural network consists of a set of units containing mathematical functions connected by weights. Such networks are capable of learning, by modifying their synaptic weights, and of generalizing what they have learned to other, unknown data sets. A neural network project comprises three stages: pre-processing, processing and, finally, post-processing of the data. One of the classical problems that networks can address is function approximation, a group that includes the estimation of tree volume. Four different architectures, five pre-processings and two activation functions were used. The networks whose estimates were statistically equal to the observed data were also analysed with respect to residuals and to the distribution of volumes, and compared with the volume estimates provided by the Schumacher and Hall model. The neural networks formed by neurons whose activation function was exponential produced estimates statistically equal to the observed data. The networks trained with data normalized by the linear-interpolation method and equalized performed best in the estimation.
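    The Schumacher and Hall model mentioned above is the allometric equation V = b0·DBH^b1·H^b2, usually fitted after log-linearization; a minimal sketch with hypothetical tree data:

```python
import numpy as np

def schumacher_hall(dbh, height, b0, b1, b2):
    """Schumacher-Hall allometric model: V = b0 * DBH^b1 * H^b2."""
    return b0 * dbh**b1 * height**b2

# Hypothetical sample: diameter at breast height (cm), height (m), volume (m^3)
dbh = np.array([12.0, 15.5, 18.2, 22.4, 27.9])
height = np.array([14.1, 17.0, 19.2, 22.5, 26.0])
volume = np.array([0.08, 0.15, 0.24, 0.42, 0.78])

# Log-linearize and fit by ordinary least squares
X = np.column_stack([np.ones_like(dbh), np.log(dbh), np.log(height)])
coef, *_ = np.linalg.lstsq(X, np.log(volume), rcond=None)
b0, b1, b2 = np.exp(coef[0]), coef[1], coef[2]
print(schumacher_hall(20.0, 21.0, b0, b1, b2))  # predicted volume for a new tree
```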

  4. The challenge of developing statistical literacy, reasoning and thinking

    CERN Document Server

    Garfield, Joan

    2004-01-01

    Research in statistics education is an emerging field, with much of the work being published in diverse journals across many disciplines. Locating and synthesizing this research is often a challenging task, as is connecting the research literature to practical issues of teaching and assessing students. This book is unique in that it collects, presents, and synthesizes cutting-edge research on different aspects of statistical reasoning and applies this research to the teaching of statistics to students at all educational levels. Unlike other books on how to teach statistics, or educational materials to help students learn statistics, this book presents the research foundation on which teaching should be based. The chapters in this volume are written by today's leading researchers in statistics education. This volume will prove of great value to mathematics and statistics education researchers, statistics educators, statisticians, cognitive psychologists, mathematics teachers, mathematics and statistics cur...

  5. Causality Statistical Perspectives and Applications

    CERN Document Server

    Berzuini, Carlo; Bernardinell, Luisa

    2012-01-01

    A state-of-the-art volume on statistical causality. Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book: provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr

  6. Computing Science and Statistics: Volume 24. Graphics and Visualization

    Science.gov (United States)

    1993-03-20

    [OCR-garbled front matter from the proceedings: an abstract on statistical models and density estimation techniques by Mike West (Institute of Statistics & Decision Sciences, Duke University, Durham NC 27708, USA), interleaved with reference fragments citing the ratio-of-uniforms method (Statistics and Computing) and authors including Best, McNeil, Sharples and Kirby, plus contributor addresses.]

  7. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Science.gov (United States)

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  8. Functional statistics and related fields

    CERN Document Server

    Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe

    2017-01-01

    This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools, as well as operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017) held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.

  9. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
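    The geodesic-versus-chordal distinction discussed above is easy to state concretely; a minimal sketch (haversine great-circle distance and the corresponding chord, with an assumed Earth radius of 6371 km):

```python
import numpy as np

def geodesic_distance(lat1, lon1, lat2, lon2, radius=6371.0):
    """Great-circle (geodesic) distance on a sphere via the haversine formula."""
    p1, l1, p2, l2 = map(np.radians, (lat1, lon1, lat2, lon2))
    h = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin((l2 - l1) / 2) ** 2
    return 2 * radius * np.arcsin(np.sqrt(h))

def chordal_distance(lat1, lon1, lat2, lon2, radius=6371.0):
    """Straight-line distance through the sphere; always <= the geodesic distance."""
    geo = geodesic_distance(lat1, lon1, lat2, lon2, radius)
    return 2 * radius * np.sin(geo / (2 * radius))

# Antipodal points: the geodesic is half the circumference, the chord the diameter
print(geodesic_distance(0.0, 0.0, 0.0, 180.0))  # ~20015 km
print(chordal_distance(0.0, 0.0, 0.0, 180.0))   # ~12742 km
```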

  10. Brain tissues volume measurements from 2D MRI using parametric approach

    Science.gov (United States)

    L'vov, A. A.; Toropova, O. A.; Litovka, Yu. V.

    2018-04-01

    The purpose of the paper is to propose a fully automated method for the volume assessment of structures within the human brain. Our statistical approach uses the maximum interdependency principle for the decision-making process concerning measurement consistency and unequal observations. Outlier detection is performed using the maximum normalized residual test. We propose a statistical model which utilizes knowledge of the tissue distribution in the human brain and applies partial data restoration to improve precision. The approach is computationally efficient and independent of the segmentation algorithm used in the application.
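    The maximum normalized residual test mentioned above is Grubbs' single-outlier test; a minimal sketch with hypothetical volume measurements:

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.05):
    """Maximum normalized residual (Grubbs) test for a single outlier."""
    x = np.asarray(x, dtype=float)
    n = x.size
    g = np.max(np.abs(x - x.mean())) / x.std(ddof=1)
    t2 = stats.t.ppf(1 - alpha / (2 * n), n - 2) ** 2   # two-sided critical value
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t2 / (n - 2 + t2))
    return g, g_crit, g > g_crit

# Hypothetical repeated tissue-volume measurements (mL) with one suspect value
volumes = [1182, 1175, 1190, 1181, 1178, 1235, 1186]
print(grubbs_test(volumes))
```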

  11. Real-time bladder volume monitoring by the application of a new implantable bladder volume sensor for a small animal model

    Directory of Open Access Journals (Sweden)

    Dong Sup Lee

    2011-04-01

    Full Text Available Although real-time monitoring of bladder volume together with intravesical pressure can provide more information for understanding the functional changes of the urinary bladder, accurate real-time prediction of bladder volume remains difficult in urodynamic studies with small animal models. We studied a new implantable bladder volume monitoring device with eight rats. During cystometry, microelectrodes prepared by the microelectromechanical systems (MEMS) process were placed symmetrically on both lateral walls of the bladder, and the expanded bladder volume was calculated. Immunohistological study was done after 1 week and after 4 weeks to evaluate the biocompatibility of the microelectrode. From the point at which the volume of saline infused into the bladder exceeded 0.6 mL, the estimated bladder volume was statistically correlated with the volume of saline injected (p<0.01). Additionally, the MEMS microelectrodes used in this study showed reliable biocompatibility. Therefore, the device can be used to evaluate changes in bladder volume in studies with small animals, and it may help to provide more information about functional changes in the bladder in laboratory studies. Furthermore, owing to its biocompatibility, the device could be chronically implanted in conscious ambulating animals, thus allowing a novel longitudinal study to be performed for a specific purpose.

  12. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
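    A minimal sketch of the language-modelling approach to retrieval: score each document by the likelihood of generating the query under a smoothed unigram model (Jelinek-Mercer smoothing is used here as one common choice, not necessarily the article's):

```python
import math
from collections import Counter

def query_log_likelihood(query, doc_tokens, collection_tokens, lam=0.5):
    """log P(query | document) under a unigram model with Jelinek-Mercer
    smoothing: P(t|d) = lam * tf_d(t)/|d| + (1 - lam) * tf_C(t)/|C|."""
    tf_d, tf_c = Counter(doc_tokens), Counter(collection_tokens)
    len_d, len_c = len(doc_tokens), len(collection_tokens)
    score = 0.0
    for term in query:
        p = lam * tf_d[term] / len_d + (1 - lam) * tf_c[term] / len_c
        score += math.log(p) if p > 0 else float("-inf")
    return score

doc = "statistical language models for information retrieval".split()
collection = doc + "natural language processing meets retrieval models".split()
print(query_log_likelihood("language retrieval".split(), doc, collection))
```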

  13. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available This study deals with an analysis of data intended to improve the quality of the statistical tools used in the assembly of automobile seats. A normal distribution of the variables is one of the essential conditions for analysing, examining, and improving manufacturing processes (e.g. manufacturing process capability), although there are increasingly many approaches to handling non-normal data. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution against the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study processes the measured data of an airbag assembly in detail, aiming to obtain normally distributed data and to apply statistical process control to them. The contribution concludes with a rejection of the null hypothesis (the measured variables do not follow a normal distribution); it is therefore necessary to work on data transformation, supported by Minitab 15. Since even this approach does not yield normally distributed data, a procedure leading to a quality output of the whole statistical control of the manufacturing processes should be proposed.
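    One standard route for the data-transformation step described above is a Box-Cox transformation followed by a normality re-test; a minimal sketch on hypothetical skewed measurements (the transformation actually applied in Minitab may differ):

```python
import numpy as np
from scipy import stats

# Hypothetical right-skewed measurements from an assembly characteristic
rng = np.random.default_rng(2)
raw = rng.lognormal(mean=0.5, sigma=0.4, size=200)

# Shapiro-Wilk normality test before and after a Box-Cox transformation
_, p_raw = stats.shapiro(raw)
transformed, lam = stats.boxcox(raw)
_, p_transformed = stats.shapiro(transformed)
print(f"lambda={lam:.2f}, p(raw)={p_raw:.3f}, p(transformed)={p_transformed:.3f}")
```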

  14. Multivariate Statistical Process Control Charts and the Problem of Interpretation: A Short Overview and Some Applications in Industry

    OpenAIRE

    Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios

    2005-01-01

    Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...

  15. Vol. 3: Statistical Physics and Phase Transitions

    International Nuclear Information System (INIS)

    Sitenko, A.

    1993-01-01

    Problems of modern physics and the situation of physical research in Ukraine are considered. The programme of the conference includes scientific and general problems. Its proceedings are published in 6 volumes. The papers presented in this volume refer to statistical physics and phase transition theory

  16. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) for electron beam monitoring in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected, they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
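    The process capability ratios quoted above compare the specification width to the process spread; a minimal sketch of the two-sided Cp computation at a 2% specification level, on hypothetical daily-output data:

```python
import numpy as np

def capability_ratio(x, target, rel_tol):
    """Two-sided process capability Cp = (USL - LSL) / (6*sigma) for
    specification limits target*(1 +/- rel_tol)."""
    x = np.asarray(x, dtype=float)
    usl, lsl = target * (1 + rel_tol), target * (1 - rel_tol)
    return (usl - lsl) / (6 * x.std(ddof=1))

rng = np.random.default_rng(3)
daily_output = rng.normal(loc=100.0, scale=0.25, size=250)  # hypothetical cGy/MU
print(f"Cp = {capability_ratio(daily_output, target=100.0, rel_tol=0.02):.1f}")
```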

  17. Statistical characterization of pitting corrosion process and life prediction

    International Nuclear Information System (INIS)

    Sheikh, A.K.; Younas, M.

    1995-01-01

    In order to prevent corrosion failures of machines and structures, it is desirable to know in advance when corrosion damage will take place, and appropriate measures are needed to mitigate the damage. Corrosion predictions are needed both at the development and at the operational stage of machines and structures. There are several forms of corrosion process through which varying degrees of damage can occur. Under certain conditions these corrosion processes act alone, and under other sets of conditions several of these processes may occur simultaneously. Certain types of machine elements and structures, such as gears, bearings, tubes, pipelines, containers, and storage tanks, are particularly prone to pitting corrosion, which is an insidious form of corrosion. Corrosion predictions are usually based on experimental results obtained from test coupons and/or field experience with similar machines or parts of a structure. Considerable scatter is observed in corrosion processes. The probabilistic nature and kinetics of the pitting process make it necessary to use statistical methods to forecast the residual life of machines and structures. The focus of this paper is to characterize pitting as a time-dependent random process; using this characterization, predictions of the life to reach a critical level of pitting damage can be made. Using several data sets from the literature on pitting corrosion, the extreme value modelling of the pitting corrosion process, the evolution of the extreme value distribution in time, and their relationship to the reliability of machines and structures are explained. (author)
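    A minimal sketch of the extreme-value modelling mentioned above: fit a Gumbel distribution to maximum pit depths and read off an exceedance quantile (the coupon data are hypothetical):

```python
import numpy as np
from scipy import stats

# Hypothetical maximum pit depths (mm) from coupons exposed for equal durations
max_pit_depth = np.array([0.42, 0.51, 0.38, 0.60, 0.47, 0.55, 0.44, 0.58, 0.49, 0.53])

loc, scale = stats.gumbel_r.fit(max_pit_depth)
depth_99 = stats.gumbel_r.ppf(0.99, loc, scale)  # depth exceeded by 1% of coupons
print(f"mu={loc:.3f} mm, beta={scale:.3f} mm, 99th-percentile depth={depth_99:.3f} mm")
```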

  18. Morphology of Laplacian growth processes and statistics of equivalent many-body systems

    International Nuclear Information System (INIS)

    Blumenfeld, R.

    1994-01-01

    The author proposes a theory for the nonlinear evolution of two-dimensional interfaces in Laplacian fields. The growing region is conformally mapped onto the unit disk, generating an equivalent many-body system whose dynamics and statistics are studied. The process is shown to be Hamiltonian, with the Hamiltonian being the imaginary part of the complex electrostatic potential. Surface effects are introduced through the Hamiltonian as an external field. An extension to a continuous density of particles is presented. The results are used to study the morphology of the interface using statistical mechanics for the many-body system. The distribution of the curvature and the moments of the growth probability along the interface are calculated exactly from the distribution of the particles. In the dilute limit, the distribution of the curvature is shown to develop algebraic tails, which may, for the first time, explain the origin of fractality in diffusion-controlled processes

  19. Statistical fluid mechanics

    CERN Document Server

    Monin, A S

    2007-01-01

    ""If ever a field needed a definitive book, it is the study of turbulence; if ever a book on turbulence could be called definitive, it is this book."" - ScienceWritten by two of Russia's most eminent and productive scientists in turbulence, oceanography, and atmospheric physics, this two-volume survey is renowned for its clarity as well as its comprehensive treatment. The first volume begins with an outline of laminar and turbulent flow. The remainder of the book treats a variety of aspects of turbulence: its statistical and Lagrangian descriptions, shear flows near surfaces and free turbulenc

  20. Ionization processes in a transient hollow cathode discharge before electric breakdown: statistical distribution

    International Nuclear Information System (INIS)

    Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.

    1998-01-01

    The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown

  1. Study of film data processing systems by means of a statistical simulation

    International Nuclear Information System (INIS)

    Deart, A.F.; Gromov, A.I.; Kapustinskaya, V.I.; Okorochenko, G.E.; Sychev, A.Yu.; Tatsij, L.I.

    1974-01-01

    A statistical model of a film information processing system is considered. The time diagrams given illustrate the algorithm of the model's operation. The program realizing this model of the system is described in detail. The elaborated program model has been tested on a film information processing system consisting of a group of measuring devices operating on-line with a BESM computer. The quantitative characteristics obtained for the functioning of the system under test make it possible to estimate the efficiency of the system's operation

  2. On-line statistical processing of radiation detector pulse trains with time-varying count rates

    International Nuclear Information System (INIS)

    Apostolopoulos, G.

    2008-01-01

    Statistical analysis is of primary importance for the correct interpretation of nuclear measurements, due to the inherent random nature of radioactive decay processes. This paper discusses the application of statistical signal processing techniques to the random pulse trains generated by radiation detectors. The aims of the presented algorithms are: (i) continuous, on-line estimation of the underlying time-varying count rate θ(t) and its first-order derivative dθ/dt; (ii) detection of abrupt changes in both of these quantities and estimation of their new value after the change point. Maximum-likelihood techniques, based on the Poisson probability distribution, are employed for the on-line estimation of θ and dθ/dt. Detection of abrupt changes is achieved on the basis of the generalized likelihood ratio statistical test. The properties of the proposed algorithms are evaluated by extensive simulations and possible applications for on-line radiation monitoring are discussed
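    A minimal sketch of the change-detection idea above: for binned counts, a generalized likelihood ratio compares a single Poisson rate against two rates split at a candidate change point (the bin width and rates here are hypothetical):

```python
import numpy as np

def glr_change_in_rate(counts, dt, split):
    """GLR statistic for an abrupt change in a Poisson count rate at bin `split`;
    larger values indicate stronger evidence of a change."""
    n0, n1 = counts[:split].sum(), counts[split:].sum()
    t0, t1 = split * dt, (len(counts) - split) * dt

    def loglik(n, t):                  # profile log-likelihood at lam = n/t
        return n * np.log(n / t) - n if n > 0 else 0.0

    return 2 * (loglik(n0, t0) + loglik(n1, t1) - loglik(n0 + n1, t0 + t1))

rng = np.random.default_rng(4)
counts = np.concatenate([rng.poisson(10.0, 200), rng.poisson(14.0, 200)])
glr = [glr_change_in_rate(counts, dt=1.0, split=k) for k in range(1, len(counts))]
print(f"most likely change point: bin {int(np.argmax(glr)) + 1}")
```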

  3. Minerals Yearbook, volume II, Area Reports—Domestic

    Science.gov (United States)

    ,

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  4. Minerals Yearbook, volume I, Metals and Minerals

    Science.gov (United States)

    ,

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  5. Minerals Yearbook, volume III, Area Reports—International

    Science.gov (United States)

    ,

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  6. Method of volume-reducing processing for radioactive wastes

    International Nuclear Information System (INIS)

    Sato, Koei; Yamauchi, Noriyuki; Hirayama, Toshihiko.

    1985-01-01

    Purpose: To process the products of radioactive liquid waste treatment and burnable solid wastes produced in nuclear facilities into stable solidified products by heat melting. Method: First, glass fiber wastes from contaminated air filters are charged into a melting furnace. Then, waste products obtained through drying, sintering, incineration, etc. are mixed with a proper amount of glass fibers and charged into the melting furnace. Both of the charged components are heated to the temperature at which the glass fibers melt. The burnable materials are burnt out to give highly volume-reduced products. When the products are further heated to a temperature at which metals or metal oxides with a higher melting point than the glass fibers melt, the glass fibers and the metals or metal oxides are fused together, combining in a molecular structure into more stable products. The products are excellent in strength, stability, durability and leaching resistance at ambient temperature. (Kamimura, M.)

  7. Reporting and analyzing statistical uncertainties in Monte Carlo-based treatment planning

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Rosu, Mihaela; Kessler, Marc L.; Fraass, Benedick A.; Haken, Randall K. ten; Kong, Feng-Ming; McShan, Daniel L.

    2006-01-01

    Purpose: To investigate methods of reporting and analyzing statistical uncertainties in doses to targets and normal tissues in Monte Carlo (MC)-based treatment planning. Methods and Materials: Methods for quantifying statistical uncertainties in dose, such as uncertainty specification to specific dose points, or to volume-based regions, were analyzed in MC-based treatment planning for 5 lung cancer patients. The effect of statistical uncertainties on target and normal tissue dose indices was evaluated. The concept of uncertainty volume histograms for targets and organs at risk was examined, along with its utility, in conjunction with dose volume histograms, in assessing the acceptability of the statistical precision in dose distributions. The uncertainty evaluation tools were extended to four-dimensional planning for application on multiple instances of the patient geometry. All calculations were performed using the Dose Planning Method MC code. Results: For targets, generalized equivalent uniform doses and mean target doses converged at 150 million simulated histories, corresponding to relative uncertainties of less than 2% in the mean target doses. For the normal lung tissue (a volume-effect organ), mean lung dose and normal tissue complication probability converged at 150 million histories despite the large range in the relative organ uncertainty volume histograms. For 'serial' normal tissues such as the spinal cord, large fluctuations exist in point dose relative uncertainties. Conclusions: The tools presented here provide useful means for evaluating statistical precision in MC-based dose distributions. Tradeoffs between uncertainties in doses to targets, volume-effect organs, and 'serial' normal tissues must be considered carefully in determining acceptable levels of statistical precision in MC-computed dose distributions
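    A minimal sketch of one common way such statistical uncertainties are quantified, the batch method (relative standard error of the mean over independent batches); the per-batch doses below are hypothetical, and the paper's own history-based estimators may differ:

```python
import numpy as np

def batch_relative_uncertainty(dose_batches):
    """Relative statistical uncertainty of an MC dose estimate from B
    independent batches: s / (mean * sqrt(B))."""
    d = np.asarray(dose_batches, dtype=float)
    return d.std(ddof=1) / (d.mean() * np.sqrt(d.size))

# Hypothetical per-batch mean target doses (Gy) from 10 independent MC runs
batches = [60.2, 59.8, 60.5, 60.1, 59.7, 60.3, 60.0, 59.9, 60.4, 60.1]
print(f"relative uncertainty = {100 * batch_relative_uncertainty(batches):.2f}%")
```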

  8. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  9. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    Science.gov (United States)

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

  10. Mathematical SETI Statistics, Signal Processing, Space Missions

    CERN Document Server

    Maccone, Claudio

    2012-01-01

    This book introduces the Statistical Drake Equation where, from a simple product of seven positive numbers, the Drake Equation is turned into the product of seven positive random variables. The mathematical consequences of this transformation are demonstrated and it is proven that the new random variable N for the number of communicating civilizations in the Galaxy must follow the lognormal probability distribution when the number of factors in the Drake equation is allowed to increase at will. Mathematical SETI also studies the proposed FOCAL (Fast Outgoing Cyclopean Astronomical Lens) space mission to the nearest Sun Focal Sphere at 550 AU and describes its consequences for future interstellar precursor missions and truly interstellar missions. In addition the author shows how SETI signal processing may be dramatically improved by use of the Karhunen-Loève Transform (KLT) rather than Fast Fourier Transform (FFT). Finally, he describes the efforts made to persuade the United Nations to make the central part...
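    The lognormal limit described above follows from the central limit theorem applied to log N; a minimal simulation sketch with arbitrary factor ranges (not the book's numbers):

```python
import numpy as np

# N is a product of seven positive random factors, so log N is a sum of
# independent terms and N is approximately lognormal. Ranges are hypothetical.
rng = np.random.default_rng(5)
ranges = [(1, 3), (0.2, 1), (0.1, 1), (0.01, 1), (0.01, 1), (0.01, 0.2), (1e3, 1e5)]
factors = np.stack([rng.uniform(lo, hi, 100_000) for lo, hi in ranges])
log_n = np.log(factors).sum(axis=0)
print(f"median N = {np.exp(np.median(log_n)):.1f}, sigma(log N) = {log_n.std():.2f}")
```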

  11. Statistical Processing Algorithms for Human Population Databases

    Directory of Open Access Journals (Sweden)

    Camelia COLESCU

    2012-01-01

    Full Text Available The article describes some algorithms for statistical functions applied to a human population database. The samples are specific to the most interesting periods, when the evolution of the statistical data shows spectacular changes. The article also describes the most useful forms of graphical presentation of the results

  12. Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals

    Directory of Open Access Journals (Sweden)

    Syafiq Hazwan

    2016-01-01

    Full Text Available Cooling systems are a key element in the hot forming process of Ultra High Strength Steels (UHSS). Normally, cooling systems are made using deep drilling techniques. Although the deep twist drill is better than other drilling techniques in terms of productivity, its main problem is premature tool breakage, which affects production quality. In this paper, an analysis of deep twist drill process parameters such as cutting speed, feed rate, and depth of cut, using statistical analysis to identify the tool condition, is presented. Comparisons between two different tool geometries are also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation, and skewness. It was found that kurtosis and skewness are the most appropriate parameters to represent deep twist drill tool conditions from the vibration and force data. The condition of the deep twist drill process was classified as good, blunt, or fractured. It was also found that different tool geometry parameters affect the performance of the drill. The results of this study are believed to be useful in determining a suitable analysis method for developing an online tool condition monitoring system to identify the tertiary stage of tool life and to help avoid premature tool fracture during the drilling process.
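    A minimal sketch of the feature extraction described above: compute RMS, mean, standard deviation, kurtosis and skewness over a signal window (the two synthetic windows mimic smooth cutting versus impact-laden cutting with a worn tool):

```python
import numpy as np
from scipy import stats

def signal_features(x):
    """Statistical features of one vibration/force signal window."""
    x = np.asarray(x, dtype=float)
    return {
        "rms": float(np.sqrt(np.mean(x**2))),
        "mean": float(x.mean()),
        "std": float(x.std(ddof=1)),
        "kurtosis": float(stats.kurtosis(x)),  # impacts drive this up
        "skewness": float(stats.skew(x)),
    }

rng = np.random.default_rng(6)
good = rng.normal(0.0, 1.0, 4096)                                      # smooth cutting
blunt = good + (rng.random(4096) < 0.01) * rng.normal(0.0, 8.0, 4096)  # rare impacts
print(signal_features(good)["kurtosis"], signal_features(blunt)["kurtosis"])
```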

  13. Load research manual. Volume 2. Fundamentals of implementing load research procedures

    Energy Technology Data Exchange (ETDEWEB)

    Brandenburg, L.; Clarkson, G.; Grund, Jr., C.; Leo, J.; Asbury, J.; Brandon-Brown, F.; Derderian, H.; Mueller, R.; Swaroop, R.

    1980-11-01

    This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.

  14. Statistical process control applied to the manufacturing of beryllia ceramics

    International Nuclear Information System (INIS)

    Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.

    1991-01-01

    To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical

  15. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by ISBrA, the Brazilian chapter of the International Society for Bayesian Analysis (ISBA) and one of its most active chapters. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in the foundations of inductive statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion of the foundations work towards the goal of proper application of Bayesia...

  16. Topics from Australian Conferences on Teaching Statistics

    CERN Document Server

    Phillips, Brian; Martin, Michael

    2014-01-01

    The first OZCOTS conference in 1998 was inspired by papers contributed by Australians to the 5th International Conference on Teaching Statistics. In 2008, as part of the program of one of the first National Senior Teaching Fellowships, the 6th OZCOTS was held in conjunction with the Australian Statistical Conference, with Fellowship keynotes and contributed papers, optional refereeing and proceedings. This venture was so successful that the 7th and 8th OZCOTS were similarly run, conjoined with Australian Statistical Conferences in 2010 and 2012. Authors of papers from these OZCOTS conferences were invited to develop chapters for refereeing and inclusion in this volume. There are sections on keynote topics, undergraduate curriculum and learning, professional development, postgraduate learning, and papers from OZCOTS 2012. Because OZCOTS aim to unite statisticians and statistics educators, the approaches this volume takes are immediately relevant to all who have a vested interest in good teaching practices. Glo...

  17. Constitutive Modelling in Thermomechanical Processes, Using The Control Volume Method on Staggered Grid

    DEFF Research Database (Denmark)

    Thorborg, Jesper

    , however, is constituted by the implementation of the $J_2$ flow theory in the control volume method. To apply the control volume formulation to the process of hardening concrete, viscoelastic stress-strain models have been examined in terms of various rheological models. The generalized 3D models are based...... on two different suggestions in the literature, that is, compressible or incompressible behaviour of the viscous response in the dashpot element. Numerical implementation of the models has shown very good agreement with corresponding analytical solutions. The viscoelastic solid mechanical model is used...

  18. Protecting the Force: Application of Statistical Process Control for Force Protection in Bosnia

    National Research Council Canada - National Science Library

    Finken, Paul

    2000-01-01

    .... In Operations Other Than War (OOTW) environments, where the enemy is disorganized and incapable of mounting a deception plan, staffs can model hostile events as stochastic events and use statistical methods to detect changes to the process...

  19. Bootstrap-based confidence estimation in PCA and multivariate statistical process control

    DEFF Research Database (Denmark)

    Babamoradi, Hamid

    Traditional/asymptotic confidence estimation has limited applicability, since it needs statistical theories to estimate the confidences, and these are not available for all indicators/parameters. Furthermore, in cases where the theories are available for a specific indicator/parameter, the theories are based...... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case-study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus...)...... be used to detect outliers in the data, since outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in the case of the Q-statistic......
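
    A minimal sketch of the bootstrap idea for an MSPC control limit, assuming a PCA model fitted to normal-operating data: resample the samples with replacement, refit the model, and collect the Q-statistic limit from each resample. The component count, quantile, and data are illustrative choices, not the thesis's algorithm in full.

        import numpy as np

        def q_statistics(X, n_comp):
            Xc = X - X.mean(axis=0)
            U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
            P = Vt[:n_comp].T                      # loadings of retained components
            residual = Xc - Xc @ P @ P.T           # part of X the model misses
            return np.sum(residual ** 2, axis=1)   # Q (squared prediction error)

        def bootstrap_q_limit(X, n_comp=2, n_boot=1000, quantile=95, seed=0):
            rng = np.random.default_rng(seed)
            limits = []
            for _ in range(n_boot):
                idx = rng.integers(0, len(X), len(X))   # resample rows
                limits.append(np.percentile(q_statistics(X[idx], n_comp), quantile))
            # Percentile-based confidence interval for the control limit itself
            return np.percentile(limits, [2.5, 50, 97.5])

        X = np.random.default_rng(1).normal(size=(100, 6))   # stand-in NOC data
        print(bootstrap_q_limit(X))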

  20. Past, present, and future of statistical science

    CERN Document Server

    Lin, Xihong; Banks, David L; Molenberghs, Geert; Scott, David W; Wang, Jane-Ling

    2014-01-01

    Past, Present, and Future of Statistical Science was commissioned in 2013 by the Committee of Presidents of Statistical Societies (COPSS) to celebrate its 50th anniversary and the International Year of Statistics. COPSS consists of five charter member statistical societies in North America and is best known for sponsoring prestigious awards in statistics, such as the COPSS Presidents' award. Through the contributions of a distinguished group of 50 statisticians who are past winners of at least one of the five awards sponsored by COPSS, this volume showcases the breadth and vibrancy of statistics...

  1. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text is submitted for the volume 'A Handbook of Spatial Statistics', edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title 'Parametric methods'.) 1 Introduction. This chapter considers...... inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has over the last two decades been supplemented by likelihood-based methods for parametric spatial point process models...... is studied in Section 4, and Bayesian inference in Section 5. On the one hand, as the development in computer technology and computational statistics continues, computationally-intensive simulation-based methods for likelihood inference will probably play an increasing role in the statistical analysis of spatial...

  2. Statistical physics of media processes: Mediaphysics

    Science.gov (United States)

    Kuznetsov, Dmitri V.; Mandel, Igor

    2007-04-01

    The processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, animal populations, etc., treated as the subject of a special scientific subbranch, "mediaphysics", are considered in their relation to sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of the population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between the state of a "person's mind" and the alternatives are measures of the propensity to buy (to affiliate, or to hold a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, "word of mouth", etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions are described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations of the factors and the target variable.

  3. POWERNEXT Carbon statistics September 30, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 trading market, for the July-September 2006 period: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The monthly volumes and closing prices for the September 2005 - September 2006 period are summarized in a graph. (J.S.)

  4. POWERNEXT Carbon statistics November 30, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 trading market, for the September-November 2006 period: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The monthly volumes and closing prices for the November 2005 - November 2006 period are summarized in a graph. (J.S.)

  5. Powernext Carbon statistics - June 30, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for the second quarter of 2006: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The monthly volumes and closing prices from June 2005 to June 2006 are summarized in a graph. (J.S.)

  6. POWERNEXT Carbon statistics October 31, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 trading market, for the August-October 2006 period: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The monthly volumes and closing prices for the October 2005 - October 2006 period are summarized in a graph. (J.S.)

  7. Powernext Carbon statistics - May 31, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for March, April and May 2006: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The daily volume and closing price from June 2005 to May 2006 are summarized in a graph. (J.S.)

  8. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    Science.gov (United States)

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…
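
    The chart the students build can be sketched as follows. The subgroup size, target absorbance, and the use of the empirical standard deviation of subgroup means (rather than the range-based chart constants) are simplifying assumptions for illustration.

        import numpy as np

        def xbar_chart_limits(subgroups):
            """Shewhart X-bar limits from subgroup means (3-sigma rule)."""
            means = subgroups.mean(axis=1)
            grand_mean = means.mean()
            sigma_xbar = means.std(ddof=1)   # sigma estimated from the data
            return grand_mean, grand_mean - 3 * sigma_xbar, grand_mean + 3 * sigma_xbar

        # 25 subgroups of 4 invented absorbance readings around 0.52
        readings = np.random.default_rng(2).normal(0.52, 0.01, size=(25, 4))
        center, lcl, ucl = xbar_chart_limits(readings)
        means = readings.mean(axis=1)
        out_of_control = (means < lcl) | (means > ucl)
        print(f"CL={center:.4f} LCL={lcl:.4f} UCL={ucl:.4f} flags={out_of_control.sum()}")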

  9. Statistics at a glance.

    Science.gov (United States)

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P value" and a few statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar: a summary and a transcription of the best pages I have found.

  10. The Relationship between Processing Speed and Regional White Matter Volume in Healthy Young People.

    Directory of Open Access Journals (Sweden)

    Daniele Magistro

    Full Text Available Processing speed is considered a key cognitive resource and plays a crucial role in all types of cognitive performance. Some researchers have hypothesised the importance of white matter integrity in the brain for processing speed; however, the relationship at the whole-brain level between white matter volume (WMV) and processing speed relevant to the modality or problem used in the task has never been clearly evaluated in healthy people. In this study, we used various tests of processing speed and Voxel-Based Morphometry (VBM) analyses, which involve a voxel-wise comparison of local gray and white matter volume, to assess the relationship between processing speed and regional WMV (rWMV). We examined the association between processing speed and WMV in 887 healthy young adults (504 men and 383 women; mean age 20.7 years, SD 1.85). We performed three different multiple regression analyses, evaluating the rWMV associated with individual differences in a simple processing speed task, word-colour and colour-word tasks (processing speed tasks with words), and a simple arithmetic task, after adjusting for age and sex. The results showed a positive relationship at the whole-brain level between rWMV and processing speed performance. In contrast, processing speed performance did not correlate with rWMV in any specific region examined. Our results support the idea that WMV is associated globally, rather than regionally, with processing speed performance regardless of the type of processing speed task.

  11. The Monte Carlo method the method of statistical trials

    CERN Document Server

    Shreider, YuA

    1966-01-01

    The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and the investigation of servicing processes. This volume is comprised of seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensional...
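
    The book's first application area, estimating a definite integral by statistical trials, fits in a few lines; the integrand below is an arbitrary example with a known answer, chosen here for checking the estimate.

        import math
        import random

        def mc_integrate(f, a, b, n=100_000, seed=42):
            """Estimate the integral of f over [a, b] by uniform sampling."""
            rng = random.Random(seed)
            total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
            return (b - a) * total / n   # (b - a) * E[f(U)], U uniform on [a, b]

        estimate = mc_integrate(math.sin, 0.0, math.pi)   # exact value: 2
        print(estimate)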

  12. Statistical dynamics of transient processes in a gas discharge plasma

    International Nuclear Information System (INIS)

    Smirnov, G.I.; Telegin, G.G.

    1991-01-01

    The properties of a gas discharge plasma to a great extent depend on random processes whose study has recently become particularly important. The present work is concerned with analyzing the statistical phenomena that occur during the prebreakdown stage in a gas discharge. Unlike other studies of breakdown in the discharge gap, in which secondary electron effects and photon processes at the electrodes must be considered, here the authors treat the case of an electrodeless rf discharge or a laser photoresonant plasma. The analysis is based on the balance between the rates of electron generation and recombination in the plasma. The fluctuation kinetics for ionization of atoms in the hot plasma may also play an important role when the electron temperature changes abruptly, as occurs during adiabatic pinching of the plasma or during electron cyclotron heating

  13. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    Science.gov (United States)

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  14. Statistics As Principled Argument

    CERN Document Server

    Abelson, Robert P

    2012-01-01

    In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative

  15. Statistical mechanics and the physics of fluids

    CERN Document Server

    Tosi, Mario

    This volume collects the lecture notes of a course on statistical mechanics, held at Scuola Normale Superiore di Pisa for third-to-fifth year students in physics and chemistry. Three main themes are covered in the book. The first part gives a compact presentation of the foundations of statistical mechanics and their connections with thermodynamics. Applications to ideal gases of material particles and of excitation quanta are followed by a brief introduction to a real classical gas and to a weakly coupled classical plasma, and by a broad overview on the three states of matter.The second part is devoted to fluctuations around equilibrium and their correlations. Coverage of liquid structure and critical phenomena is followed by a discussion of irreversible processes as exemplified by diffusive motions and by the dynamics of density and heat fluctuations. Finally, the third part is an introduction to some advanced themes: supercooling and the glassy state, non-Newtonian fluids including polymers and liquid cryst...

  16. Powernext Carbon statistics - August 31, 2005

    International Nuclear Information System (INIS)

    2005-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for June, July and August 2005: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The daily volume and closing price from June 2005 to August 2005 are summarized in a graph, and a members list is supplied. (J.S.)

  17. Powernext Carbon statistics - March 31, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for the first quarter of 2006: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The daily volume and closing price from June 2005 to March 2006 are summarized in a graph, and a members list is supplied. (J.S.)

  18. Powernext Carbon statistics - November 30, 2005

    International Nuclear Information System (INIS)

    2005-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for September, October and November 2005: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The daily volume and closing price from June 2005 to November 2005 are summarized in a graph, and a members list is supplied. (J.S.)

  19. Powernext Carbon statistics - February 28, 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for December 2005 and January-February 2006: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The daily volume and closing price from June 2005 to February 2006 are summarized in a graph, and a members list is supplied. (J.S.)

  20. Powernext Carbon statistics - October 31, 2005

    International Nuclear Information System (INIS)

    2005-01-01

    This short document summarizes the statistics of Powernext Carbon, the European CO2 quotas trading market, for August, September and October 2005: total market volume, daily average, highest, number and average size of trades, number of members, average closing price, variation, low and high traded. The daily volume and closing price from June 2005 to October 2005 are summarized in a graph, and a members list is supplied. (J.S.)

  1. Gamma ray densitometry techniques for measuring of volume fractions

    International Nuclear Information System (INIS)

    Affonso, Renato Raoni Werneck; Silva, Ademir Xavier da; Salgado, Cesar Marques

    2015-01-01

    Knowledge of the volume fraction in a multiphase flow is of key importance in predicting the performance of many systems and processes, and it is therefore an important parameter for characterizing such flows. Among nuclear techniques, gamma ray densitometry is promising due to its non-invasive character and very reliable results. It is used in several applications for multiphase (water-oil-air) flows, employing tools such as computational fluid dynamics, artificial neural networks and statistical methods of radiation transport, such as the Monte Carlo method. The aim of this paper is to present several techniques developed for measuring volume fractions based on gamma radiation. (author)

  2. Gamma ray densitometry techniques for measuring of volume fractions

    Energy Technology Data Exchange (ETDEWEB)

    Affonso, Renato Raoni Werneck; Silva, Ademir Xavier da; Salgado, Cesar Marques, E-mail: raoniwa@yahoo.com.br, E-mail: ademir@nuclear.ufrj.br, E-mail: otero@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Knowledge of the volume fraction in a multiphase flow is of key importance in predicting the performance of many systems and processes, and it is therefore an important parameter for characterizing such flows. Among nuclear techniques, gamma ray densitometry is promising due to its non-invasive character and very reliable results. It is used in several applications for multiphase (water-oil-air) flows, employing tools such as computational fluid dynamics, artificial neural networks and statistical methods of radiation transport, such as the Monte Carlo method. The aim of this paper is to present several techniques developed for measuring volume fractions based on gamma radiation. (author)

  3. Erwin Schroedinger: Collected papers V. 1. Contributions to statistical mechanics

    International Nuclear Information System (INIS)

    Schroedinger, E.

    1984-01-01

    The 38 publications reprinted in this volume show that an interest in statistical problems accompanied Schroedinger during his entire scientific career. Already in his second paper he worked on the magnetism of solids. The classical considerations come close to the heart of diamagnetism and also to the origin of paramagnetism. In classical investigations of the specific heat, Schroedinger contributed through abstract theory but also by analysing a gigantic amount of experimental material. In 1926 he and F. Kohlrausch actually played the 'urn game of Ehrenfest' as a model of the H-curve and published the results. An inclination towards experimenting, sequences of measurements and statistical evaluation of experimental data led to papers on the foundations of the theory of probability, where he tries to put the subjective probability concept into a systematic framework. Two earlier papers on the dynamics of the elastic chain remained particularly valuable. By solving the initial value problem with Bessel functions, this many-body problem is brought to an explicit discussion. These studies are likely to be the roots of another keynote in Schroedinger's thinking, namely irreversibility. In 1945 a statistical theory of chain reactions was published under the inconspicuous title of 'Probability Problems in Nuclear Chemistry'. In his last work Schroedinger takes a wrong turn: he suggests that energy should only be a statistical concept, not conserved in elementary processes but somehow only in the mean. These short remarks only illuminate the diversity of the material in this volume, but they testify to Schroedinger's deep understanding of this field. (W.K.)

  4. Prediction of Al2O3 leaching recovery in the Bayer process using statistical multilinear regression analysis

    Directory of Open Access Journals (Sweden)

    Đurić I.

    2010-01-01

    Full Text Available This paper presents the results of defining a mathematical model which describes the dependence of the leaching degree of Al2O3 in bauxite on the most influential input parameters under industrial conditions of the leaching process in the Bayer technology of alumina production. The mathematical model is defined using the stepwise MLRA method, with R2 = 0.764 and significant statistical reliability (VIF < 2 and p < 0.05), on a one-year statistical sample. Validation of the acquired model was performed using data from the following year, collected from the process conducted under industrial conditions, yielding the same statistical reliability, with R2 = 0.759.
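
    A hedged sketch of the fitting step: plain ordinary least squares with R-squared, standing in for the paper's stepwise MLRA. The predictor names and data are invented placeholders, not the plant measurements.

        import numpy as np

        def fit_mlr(X, y):
            """Least-squares fit with intercept; returns coefficients and R^2."""
            A = np.column_stack([np.ones(len(X)), X])       # add intercept column
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            residuals = y - A @ coef
            r2 = 1.0 - (residuals @ residuals) / ((y - y.mean()) @ (y - y.mean()))
            return coef, r2

        rng = np.random.default_rng(3)
        X = rng.normal(size=(365, 4))   # e.g. caustic conc., temperature, time, ratio
        y = 70 + X @ np.array([2.0, 1.5, 0.8, 0.3]) + rng.normal(0, 2.0, 365)
        coef, r2 = fit_mlr(X, y)
        print(f"R^2 = {r2:.3f}")        # validation would repeat this on new data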

  5. Optimization Model for Uncertain Statistics Based on an Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Yongchao Hou

    2014-01-01

    Full Text Available Uncertain statistics is a methodology for collecting and interpreting experts' experimental data using uncertainty theory. In order to estimate uncertainty distributions, an optimization model based on the analytic hierarchy process (AHP) and an interpolation method is proposed in this paper. In addition, the principle of the least squares method is presented for estimating uncertainty distributions with known functional form. Finally, the effectiveness of this method is illustrated by an example.

  6. Volume of the adrenal and pituitary glands in depression

    DEFF Research Database (Denmark)

    Kessing, Lars Vedel; Willer, Inge Stoel; Knorr, Ulla

    2011-01-01

    Numerous studies have shown that the hypothalamic-pituitary-adrenal (HPA) axis is hyperactive in some depressed patients. It is unclear whether such hyperactivity results in changed volumes of the adrenal glands, pituitary gland and hypothalamus. We systematically reviewed all controlled studies...... on the adrenal or pituitary glands or hypothalamus volume in unipolar depressive disorder published in PubMed 1966 to December 2009. We identified three studies that investigated the volume of the adrenal glands and eight studies that examined the volume of the pituitary gland, but no studies on hypothalamus...... were found. Two out of three studies found a statistically significant increase in adrenal volume in patients compared to controls. Four out of eight studies found a statistically significant increase in pituitary volume in patients compared to controls. Different methodological problems were...

  7. Braid group, knot theory and statistical mechanics II

    CERN Document Server

    Yang Chen Ning

    1994-01-01

    The present volume is an updated version of the book edited by C N Yang and M L Ge on the topics of braid groups and knot theory, which are related to statistical mechanics. This book is based on the 1989 volume but has new material included and new contributors.

  8. A flexible statistics web processing service--added value for information systems for experiment data.

    Science.gov (United States)

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionalities like analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system. That leads to expensive integration, substitution, or extension of tools, because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting them to another system usually requires extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated in any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.
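
    The configuration-driven idea can be sketched with the Python standard library alone. The XML layout, endpoint shape, and the use of the statistics module are assumptions for illustration; they do not reproduce the paper's actual service.

        import statistics
        import xml.etree.ElementTree as ET
        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import parse_qs, urlparse

        # Hypothetical XML configuration: each entry maps a URL name to a routine.
        CONFIG = ET.fromstring("""
        <applications>
          <application name="mean" callable="mean"/>
          <application name="stdev" callable="stdev"/>
        </applications>
        """)
        REGISTRY = {app.get("name"): getattr(statistics, app.get("callable"))
                    for app in CONFIG.iter("application")}

        class StatsHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # e.g. GET /mean?x=1&x=2&x=3 returns the mean of the x values
                url = urlparse(self.path)
                op = url.path.strip("/")
                values = [float(v) for v in parse_qs(url.query).get("x", [])]
                if op not in REGISTRY or not values:
                    self.send_error(404, "unknown statistic or no data")
                    return
                body = str(REGISTRY[op](values)).encode()
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            HTTPServer(("localhost", 8000), StatsHandler).serve_forever()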

  9. Statistical Analysis of the First Passage Path Ensemble of Jump Processes

    Science.gov (United States)

    von Kleist, Max; Schütte, Christof; Zhang, Wei

    2018-02-01

    The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
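
    For the Markovian case, the "solve a linear system" computation mentioned above looks roughly like this: mean first-passage times into a target set satisfy a linear equation over the non-target states. The 4-state chain and target set below are an invented example.

        import numpy as np

        def mean_first_passage(P, target):
            """m_i = 1 + sum_j P_ij m_j outside the target, m_i = 0 inside."""
            n = len(P)
            free = [i for i in range(n) if i not in target]
            A = np.eye(len(free)) - P[np.ix_(free, free)]
            m = np.zeros(n)
            m[free] = np.linalg.solve(A, np.ones(len(free)))
            return m

        P = np.array([[0.5, 0.5, 0.0, 0.0],
                      [0.25, 0.5, 0.25, 0.0],
                      [0.0, 0.25, 0.5, 0.25],
                      [0.0, 0.0, 0.5, 0.5]])
        print(mean_first_passage(P, target={3}))   # expected hitting times of state 3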

  10. Brain Volume Estimation Enhancement by Morphological Image Processing Tools

    Directory of Open Access Journals (Sweden)

    Zeinali R.

    2017-12-01

    Full Text Available Background: Volume estimation of the brain is important for many neurological applications; it is necessary for measuring brain growth and changes in the brains of normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to its excellent image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices at good resolution. In this study, we sought to enhance the stereology method for volume estimation of the brain using fewer MRI slices at lower resolution. Methods: A program for calculating volume using the stereology method was developed. A morphological operation, dilation, was applied to enhance the stereology method. For the evaluation of this method, we used T1-weighted MR images from a digital phantom in BrainWeb, for which the ground truth is known. Results: The volumes of 20 normal brains extracted from BrainWeb were calculated. The volumes of white matter, gray matter and cerebrospinal fluid with given dimensions were estimated correctly. Volume calculations with the stereology method were made for three cases, and the Root Mean Square Error (RMSE) was measured for each: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution, the stereology parameters). Comparing the results of the two methods shows that the RMSE values for the proposed method are smaller than for the plain stereology method. Conclusion: Using the morphological operation of dilation enhances the stereology volume estimation method. With fewer MRI slices and fewer test points, the enhanced method works much better than the stereology method alone.
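
    A rough sketch of the point-counting estimate and the dilation step, run on a synthetic sphere; the grid spacing, slice sampling, and structuring element are illustrative choices, not the study's settings.

        import numpy as np
        from scipy.ndimage import binary_dilation

        def stereology_volume(mask_slices, thickness, spacing):
            """Volume = (test points hit) * (area per point) * slice thickness."""
            hits = sum(int(s[::spacing, ::spacing].sum()) for s in mask_slices)
            return hits * (spacing ** 2) * thickness

        # Synthetic object: a sphere of radius 20 voxels, sampled every 5th slice.
        z, y, x = np.ogrid[:64, :64, :64]
        sphere = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 <= 20 ** 2
        slices = [sphere[k] for k in range(0, 64, 5)]

        plain = stereology_volume(slices, thickness=5, spacing=4)
        dilated = stereology_volume([binary_dilation(s) for s in slices],
                                    thickness=5, spacing=4)
        print(int(sphere.sum()), plain, dilated)   # ground truth vs. estimates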

  11. Flaws and fallacies in statistical thinking

    CERN Document Server

    Campbell, Stephen K

    2004-01-01

    This book was written with a dual purpose: first, the author was motivated to relieve his distress over the faulty conclusions drawn from the frequent misuse of relatively simple statistical tools such as percents, graphs, and averages. Second, his objective was to create a nontechnical book that would help people make better-informed decisions by increasing their ability to judge the quality of statistical evidence. This volume achieves both, serving as a supplemental text for students taking their first course in statistics, and as a self-help guide for anyone wishing to evaluate statistical evidence.

  12. Statistical mechanics principles and selected applications

    CERN Document Server

    Hill, Terrell L

    1956-01-01

    ""Excellent … a welcome addition to the literature on the subject."" - ScienceBefore the publication of this standard, oft-cited book, there were few if any statistical-mechanics texts that incorporated reviews of both fundamental principles and recent developments in the field.In this volume, Professor Hill offers just such a dual presentation - a useful account of basic theory and of its applications, made accessible in a comprehensive format. The book opens with concise, unusually clear introductory chapters on classical statistical mechanics, quantum statistical mechanics and the relatio

  13. Statistical relation between particle contaminations in ultra pure water and defects generated by process tools

    NARCIS (Netherlands)

    Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke

    2007-01-01

    Ultra pure water supplied inside the fab is used in different tools at different stages of processing. Data on the particles measured in ultra pure water were compared with the defect density on wafers processed on these tools, and a statistical relation was found. Keywords: yield, defect density, ...

  14. Spectral deformation techniques applied to the study of quantum statistical irreversible processes

    International Nuclear Information System (INIS)

    Courbage, M.

    1978-01-01

    A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)

  15. Nuclear multifragmentation within the framework of different statistical ensembles

    International Nuclear Information System (INIS)

    Aguiar, C.E.; Donangelo, R.; Souza, S.R.

    2006-01-01

    The sensitivity of the statistical multifragmentation model to the underlying statistical assumptions is investigated. We concentrate on its microcanonical, canonical, and isobaric formulations. As far as average values are concerned, our results reveal that all the ensembles make very similar predictions, as long as the relevant macroscopic variables (such as temperature, excitation energy, and breakup volume) are the same in all statistical ensembles. It also turns out that the multiplicity dependence of the breakup volume in the microcanonical version of the model mimics a system at (approximately) constant pressure, at least in the plateau region of the caloric curve. However, in contrast to average values, our results suggest that the distributions of physical observables are quite sensitive to the statistical assumptions. This finding may help in deciding which hypothesis corresponds to the best picture for the freeze-out stage

  16. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need for a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization...

  17. HUMAN GLOMERULAR VOLUME QUANTIFICATION DURING THE AGING PROCESS

    Directory of Open Access Journals (Sweden)

    Dejan Zdravković

    2004-12-01

    Full Text Available Kidney function is directly related to changes in renal tissue, especially the glomeruli, which are particularly distinct during the aging process. The impossibility of substituting kidney function points to the need to estimate glomerular morphologic and functional characteristics during the aging process. Human cadaveric kidney tissue samples were used as material in this research. The age of the cadavers ranged from 20 to over 70 years, and they were classified according to the scheme: I (20-29); II (30-39); III (40-49); IV (50-59); V (60-69) and VI (older than 70). After routine histologic preparation of the renal tissue, the slices were analyzed stereologically under a light microscope with a projection screen (Reichert Visopan) at 40x lens magnification. The M42 test system was used, and 100 glomeruli, selected by an unbiased method, were analyzed. The average glomerular capillary network volume shows a significant increase (p < 0.001) up to the age of 50 years with respect to the 20-29 age group. This parameter shows an insignificant decrease after the age of 50 until the age of 70. The decrease was significant after the age of 70 with respect to the 20-29 age group (p < 0.05) and the 40-49 age group (p < 0.01).

  18. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the "Big Data" concept in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in conditions of globalization and brings new forms of economic development to small open economies. Statistical science should take into account such phenomena as the gig economy, the sharing economy, institutional factors, etc. The concepts of "Big Data" and open data are analyzed, and problems of implementing "Big Data" in official statistics are shown. Ways of implementing "Big Data" in the official statistics of Ukraine through active use of the technological capabilities of mobile operators, navigation systems, surveillance cameras, social networks, etc. are presented. The possibilities of using "Big Data" in different sectors of the economy, including at the level of companies, are shown. The problems of storing large volumes of data are highlighted. The study shows that "Big Data" is a huge resource that should be used across the Ukrainian economy.

  19. Tools for the analysis of dose optimization: I. Effect-volume histogram

    International Nuclear Information System (INIS)

    Alber, M.; Nuesslin, F.

    2002-01-01

    With the advent of dose optimization algorithms, predominantly for intensity-modulated radiotherapy (IMRT), computer software has progressed beyond the point of being merely a tool at the hands of an expert and has become an active, independent mediator of the dosimetric conflicts between treatment goals and risks. To understand and control the internal decision making as well as to provide means to influence it, a tool for the analysis of the dose distribution is presented which reveals the decision-making process performed by the algorithm. The internal trade-offs between partial volumes receiving high or low doses are driven by functions which attribute a weight to each volume element. The statistics of the distribution of these weights is cast into an effect-volume histogram (EVH) in analogy to dose-volume histograms. The analysis of the EVH reveals which traits of the optimum dose distribution result from the defined objectives, and which are a random consequence of under- or misspecification of treatment goals. The EVH can further assist in the process of finding suitable objectives and balancing conflicting objectives. If biologically inspired objectives are used, the EVH shows the distribution of local dose effect relative to the prescribed level. (author)
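
    Casting per-voxel weights into an EVH is a small computation, sketched below in analogy to a cumulative DVH; the weight field here is randomly generated rather than produced by an optimizer.

        import numpy as np

        def effect_volume_histogram(weights, n_bins=50):
            """Fraction of the structure's volume whose weight exceeds each level."""
            levels = np.linspace(weights.min(), weights.max(), n_bins)
            volume_fraction = np.array([(weights >= lv).mean() for lv in levels])
            return levels, volume_fraction

        # Stand-in per-voxel weights for one structure (100k voxels)
        weights = np.random.default_rng(4).gamma(2.0, 1.0, size=100_000)
        levels, vf = effect_volume_histogram(weights)
        print(levels[:3], vf[:3])   # vf starts at 1.0 and decreases, like a DVH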

  20. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    Science.gov (United States)

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  1. Electric power annual 1994. Volume 2, Operational and financial data

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-28

    This year, the annual is published in two volumes. Volume I focused on US electric utilities and contained final 1994 data on net generation, fossil fuel consumption, stocks, receipts, and cost. This Volume II presents annual 1994 summary statistics for the electric power industry, including information on both electric utilities and nonutility power producers. Included are preliminary data for electric utility retail sales of electricity, associated revenue, and average revenue per kilowatthour of electricity sold (based on form EIA-861) and for electric utility financial statistics, environmental statistics, power transactions, and demand-side management. Final 1994 data for US nonutility power producers on installed capacity and gross generation, as well as supply and disposition information, are also provided in Volume II. Technical notes and a glossary are included.

  2. Integer, fractional, and anomalous quantum Hall effects explained with Eyring's rate process theory and free volume concept.

    Science.gov (United States)

    Hao, Tian

    2017-02-22

    The Hall effects, especially the integer, fractional and anomalous quantum Hall effects, have been addressed using Eyring's rate process theory and the free volume concept. The basic assumptions are that the conduction process is a common rate-controlled "reaction" process that can be described with Eyring's absolute rate process theory, and that the mobility of electrons should depend on the free volume available for conduction electrons. The obtained Hall conductivity is clearly quantized in units of e2/h, with prefactors related to both the magnetic flux quantum number and the magnetic quantum number via the azimuthal quantum number, with and without an externally applied magnetic field. This article focuses on two-dimensional (2D) systems, but the approaches developed in this article can be extended to 3D systems.

  3. Diagnostic Accuracy of the Volume Rendering Images of Multi-Detector CT for the Detection of Lumbar Transverse Process Fractures

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yun Hak; Chun, Tong Jin [Dept. of Radiology, Eulji University Hospital, Daejeon (Korea, Republic of)

    2012-01-15

    To compare the accuracy of three-dimensional computed tomographic (3D CT) volume rendering techniques with axial images of multi-detector row computed tomography for identifying lumbar transverse process (LTP) fractures in trauma patients, we retrospectively evaluated 42 patients with back pain resulting from blunt trauma between January and June of 2010. Two radiologists examined the 3D CT volume rendering images independently. Confirmation of an LTP fracture was based on the consensus reading of the axial images by the two radiologists. The results of the 3D CT volume rendering images were compared with the axial images, and the diagnostic powers (sensitivity, specificity, and accuracy) were calculated. Seven of the 42 patients had twenty-five LTP fractures. The diagnostic power of the 3D CT volume rendering technique was as accurate as the axial images (Reader 1: sensitivity 96%, specificity 100%, accuracy 99.9%; Reader 2: sensitivity 100%, specificity 99.8%, accuracy 99.8%). The agreement between the two radiologists was 99.8%. 3D CT volume rendering images can substitute for axial images in detecting lumbar transverse process fractures, with good image quality.

  4. NJOY nuclear data processing system. Volume IV. The ERRORR and COVR modules

    International Nuclear Information System (INIS)

    Muir, D.W.; MacFarlane, R.E.

    1985-12-01

    The NJOY nuclear data processing system is a comprehensive computer code package for producing cross sections and related nuclear parameters from ENDF/B evaluated nuclear data. This volume provides detailed descriptions of the NJOY modules ERRORR and COVR, which are concerned with the covariances (uncertainties and correlations) of multigroup cross sections and fission neutron yield (anti nu) values. 17 refs

  5. VOLUME STUDY WITH HIGH DENSITY OF PARTICLES BASED ON CONTOUR AND CORRELATION IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tatyana Yu. Nikolaeva

    2014-11-01

    Full Text Available The subject of this study is techniques for evaluating particle statistics, in particular, methods for processing particle images obtained under coherent illumination. This paper considers the problem of recognizing and statistically accounting for individual images of small scattering particles in an arbitrary section of the volume in the case of high concentrations. For automatic recognition of focused particle images, a special algorithm for statistical analysis based on contouring and thresholding was used. By means of the mathematical formalism of scalar diffraction theory, coherent images of the particles formed by an optical system with high numerical aperture were simulated. Numerical testing of the proposed method was performed for different concentrations and distributions of particles in the volume. As a result, distributions of the density and mass fraction of the particles were obtained, and the efficiency of the method at different concentrations was evaluated. At high concentrations, the effect of coherent superposition of particles from adjacent planes strengthens, which makes it difficult to recognize particle images using the algorithm considered in the paper. In this case, we propose to supplement the method by calculating the cross-correlation function of particle images from adjacent segments of the volume and evaluating the ratio between the height of the correlation peak and the height of the function's pedestal for different distribution characters. The method of statistical accounting of particles considered in this paper is of practical importance in the study of volumes containing particles of different nature, for example, in problems of biology and oceanography. Effective work in the regime of high concentrations expands the limits of applicability of these methods for practically important cases and helps to optimize the determination time of the distribution character and...
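
    The recognition step (threshold, then count connected particle images) can be condensed as follows; the synthetic frame and threshold are invented stand-ins for the simulated coherent images analyzed in the paper.

        import numpy as np
        from scipy.ndimage import label

        rng = np.random.default_rng(5)
        frame = rng.normal(0.1, 0.02, size=(256, 256))    # background speckle
        for _ in range(30):                               # drop in 30 bright particles
            r, c = rng.integers(5, 250, size=2)
            frame[r - 2:r + 3, c - 2:c + 3] += 0.8

        binary = frame > 0.5                              # global threshold
        labels, n_found = label(binary)                   # connected components
        sizes = np.bincount(labels.ravel())[1:]           # pixel area per particle
        print(n_found, sizes.mean())   # overlapping particles merge, so n_found <= 30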

  6. Volume calibration for nuclear materials control: ANSI N15.19-1989 and beyond

    International Nuclear Information System (INIS)

    Liebetrau, A.M.

    1994-03-01

    Since the last IAEA International Safeguards Symposium, a revised standard for volume calibration methodology was issued in the United States. Because the new standard reflects the advent of high-precision volume measurement technology, it is significantly different from the earlier standard which it supersedes. The new standard outlines a unified data standardization model that applies to process tanks equipped with differential pressure measurement systems for determining liquid content. At the heart of the model is an algorithm to determine liquid height from pressure measurements that accounts for the major factors affecting the accuracy of those measurements. The standardization model also contains algorithms that adjust data from volumetric and gravimetric provers to a standard set of reference conditions. A key component of the standardization model is an algorithm to take account of temperature-induced dimensional changes in the tank. Improved methods for the statistical treatment of calibration data have also appeared since the last Safeguards Symposium. A statistical method of alignment has been introduced that employs a least-squares criterion to determine "optimal" alignment factors. More importantly, a statistical model has been proposed that yields plausible estimates of the variance of height and volume measurements when significant run-to-run differences are present in the calibration data. The new standardization model and statistical methods described here are being implemented in a portable, user-friendly software program for use by IAEA inspectors and statisticians. Perhaps these methods will eventually find their way into appropriate international standards
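
    The core of the height algorithm is the hydrostatic relation, followed by a thermal correction of the tank dimensions of the kind the standard describes; the coefficients and readings below are illustrative placeholders, not values from ANSI N15.19.

        G = 9.80665   # standard gravity, m/s^2

        def liquid_height(dp_pa, density_kg_m3):
            """Hydrostatic relation: dP = rho * g * h, solved for h."""
            return dp_pa / (density_kg_m3 * G)

        def standardized_volume(height_m, area_m2, temp_c,
                                ref_temp_c=25.0, alpha=16e-6):
            """Correct the tank cross-section for thermal expansion, then V = A*h."""
            area_at_temp = area_m2 * (1 + 2 * alpha * (temp_c - ref_temp_c))
            return area_at_temp * height_m

        h = liquid_height(dp_pa=11_770.0, density_kg_m3=1200.0)  # ~1.0 m of solution
        print(h, standardized_volume(h, area_m2=2.5, temp_c=40.0))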

  7. Parallel statistical image reconstruction for cone-beam x-ray CT on a shared memory computation platform

    International Nuclear Information System (INIS)

    Kole, J S; Beekman, F J

    2005-01-01

    Statistical reconstruction methods offer possibilities for improving image quality as compared to analytical methods, but current reconstruction times prohibit routine clinical applications. To reduce reconstruction times we have parallelized a statistical reconstruction algorithm for cone-beam x-ray CT, the ordered subset convex algorithm (OSC), and evaluated it on a shared memory computer. Two different parallelization strategies were developed: one that employs parallelism by computing the work for all projections within a subset in parallel, and one that divides the total volume into parts and processes the work for each sub-volume in parallel. Both methods are used to reconstruct a three-dimensional mathematical phantom on two different grid densities. The reconstructed images are binary identical to the result of the serial (non-parallelized) algorithm. The speed-up factor equals approximately 30 when using 32 to 40 processors, and scales almost linearly with the number of CPUs for both methods. The huge reduction in computation time allows us to apply statistical reconstruction to clinically relevant studies for the first time

  8. Guideline implementation in clinical practice: use of statistical process control charts as visual feedback devices.

    Science.gov (United States)

    Al-Hussein, Fahad A

    2009-01-01

    To use statistical control charts in a series of audits to improve the acceptance and consistent use of guidelines, and to reduce variations in prescription processing in primary health care. A series of audits was done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions given in the two previous weeks. Thirty-six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by failure to record the patient's weight (16.4%), the pharmacist's name (14.3%), the duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
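
    The p-chart arithmetic behind the audits is simple; the sketch below centres the chart on the 25% specification mentioned above, with invented audit counts for samples of n = 30.

        import math

        def p_chart_limits(p0, n):
            """3-sigma limits for a proportion chart centred on specification p0."""
            sigma = math.sqrt(p0 * (1 - p0) / n)
            return max(0.0, p0 - 3 * sigma), min(1.0, p0 + 3 * sigma)

        nonconforming = [30, 27, 24, 20, 18, 15, 14, 12, 12, 11]  # per audit, n = 30
        n = 30
        lcl, ucl = p_chart_limits(0.25, n)
        for k, x in enumerate(nonconforming, 1):
            flag = "out" if not lcl <= x / n <= ucl else "in"
            print(f"audit {k:2d}: p = {x / n:.2f} ({flag})")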

  9. MEAN PLATELET VOLUME AND RISK OF THROMBOTIC STROKE

    Directory of Open Access Journals (Sweden)

    Prasantha Kumar Thankappan

    2017-07-01

    Full Text Available BACKGROUND Stroke is a major cause of long-term morbidity and mortality. Several factors are known to increase susceptibility to stroke. Platelets play a crucial role in the pathogenesis of atherosclerotic complications, contributing to thrombus formation. Platelet size (mean platelet volume, MPV) is a marker and possible determinant of platelet function, large platelets being potentially more reactive. Hence an attempt has been made to study the association, if any, between mean platelet volume and thrombotic stroke. The aim of this study was to determine whether an association exists between mean platelet volume (MPV) and thrombotic stroke. MATERIALS AND METHODS The study is a case-control study with data collected at Government Medical College Hospital, Kottayam, Kerala, a tertiary care referral centre. The study was carried out among fifty patients diagnosed with thrombotic stroke and presenting to the hospital within forty-eight hours of onset of symptoms. Fifty age- and sex-matched controls were also recruited. Mean platelet volume was obtained using a SYSMEX automated analyser. RESULTS This study has shown a statistically significant relation between mean platelet volume and risk of thrombotic stroke, but no statistically significant correlation between clinical severity of stroke and mean platelet volume. CONCLUSION This study has shown an elevation of MPV in the acute phase of thrombotic stroke. Platelet mass was found to be more or less constant. The study did not find a statistically significant correlation between clinical severity of stroke and mean platelet volume.
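
    A case-control comparison of this kind reduces to a two-sample test; the MPV values below are invented for illustration, and the paper's actual analysis may differ.

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(6)
        mpv_cases = rng.normal(10.2, 1.0, 50)     # fL, hypothetical stroke patients
        mpv_controls = rng.normal(9.4, 1.0, 50)   # fL, hypothetical matched controls
        t, p = ttest_ind(mpv_cases, mpv_controls)
        print(f"t = {t:.2f}, p = {p:.4f}")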

  10. Genetic correlations between brain volumes and the WAIS-III dimensions of verbal comprehension, working memory, perceptual organization, and processing speed.

    Science.gov (United States)

    Posthuma, Daniëlle; Baaré, Wim F C; Hulshoff Pol, Hilleke E; Kahn, René S; Boomsma, Dorret I; De Geus, Eco J C

    2003-04-01

    We recently showed that the correlation of gray and white matter volume with full scale IQ and the Working Memory dimension are completely mediated by common genetic factors (Posthuma et al., 2002). Here we examine whether the other WAIS III dimensions (Verbal Comprehension, Perceptual Organization, Processing Speed) are also related to gray and white matter volume, and whether any of the dimensions are related to cerebellar volume. Two overlapping samples provided 135 subjects from 60 extended twin families for whom both MRI scans and WAIS III data were available. All three brain volumes are related to Working Memory capacity (r = 0.27). This phenotypic correlation is completely due to a common underlying genetic factor. Processing Speed was genetically related to white matter volume (r(g) = 0.39). Perceptual Organization was both genetically (r(g) = 0.39) and environmentally (r(e) = -0.71) related to cerebellar volume. Verbal Comprehension was not related to any of the three brain volumes. It is concluded that brain volumes are genetically related to intelligence which suggests that genes that influence brain volume may also be important for intelligence. It is also noted however, that the direction of causation (i.e., do genes influence brain volume which in turn influences intelligence, or alternatively, do genes influence intelligence which in turn influences brain volume), or the presence or absence of pleiotropy has not been resolved yet.

  11. Quantitative study on lung volume and lung perfusion using SPECT and CT in thoracal tumors

    International Nuclear Information System (INIS)

    Beyer-Enke, S.A.; Goerich, J.; Strauss, L.G.

    1988-01-01

    22 patients with space-occupying lesions in the thoracic region were investigated by computed tomography and by perfusion scintigraphy using SPECT. In order to evaluate the CT images quantitatively, the lung volume was determined using an approximation method and compared with the perfusion in the SPECT study. For this, anatomically equivalent transaxial SPECT slices were matched to the CT slices. A statistically significant correlation was found between the determined lung volumes and the activity in the corresponding layers. It could be shown that the stronger perfusion frequently observed on the right side of the healthy lung may be explained by the higher volume of the right lung. Whereas in benign space-occupying processes the ratio of activity to volume was similar to that of the healthy lung, a strongly reduced perfusion together with inconspicuous lung volumes became apparent with malignant tumors. In addition to the high morphological value of CT and SPECT studies, additional information regarding the benign or malignant nature of space-occupying processes may be derived from the quantitative evaluation of both methods. (orig.) [de

  12. Processing plutonium-contaminated soil for volume reduction using the segmented gate system

    International Nuclear Information System (INIS)

    Moroney, K.S.; Moroney, J.D.; Turney, J.M.; Doane, R.W.

    1994-01-01

    TMA/Eberline has developed and demonstrated an effective method for removing mixed plutonium and americium contamination from a coral soil matrix at the Defense Nuclear Agency's Johnston Atoll site. TMA's onsite soil processing for volume reduction is ongoing at a rate of over 2000 metric tons per week. The system uses arrays of sensitive radiation detectors coupled with sophisticated computer software developed by Eberline Instrument Corporation. The proprietary software controls four soil sorting units operating in parallel that utilize TMA's unique Segmented Gate System technology to remove radiologically contaminated soil from a moving supply on conveyor belts. Clean soil is released for use elsewhere on the island. Contaminated soil is diverted to either a metal drum for collecting higher activity "hot" particles (>5000 becquerels), or to a supplementary soil washing process designed to remove finely divided particles of dispersed low level contamination. Site contamination limits specify maximum dispersed radioactivity of no more than 500 becquerels per kilogram of soil averaged over no more than 0.1 cubic meter. Results of soil processing at this site have been excellent. After processing over 50,000 metric tons, the volume of contaminated material that would have required expensive special handling, packaging, and disposal as radioactive waste has been successfully reduced by over 98 percent. By mid-January 1994, nearly three million kilobecquerels of plutonium/americium contamination had been physically separated from the contaminated feed by TMA's Segmented Gate System, and quality control sampling showed no radioactivity above release criteria in the "clean" soil pile.

  13. A numerical-statistical approach to determining the representative elementary volume (REV) of cement paste for measuring diffusivity

    Directory of Open Access Journals (Sweden)

    Zhang, M. Z.

    2010-12-01

    Full Text Available Concrete diffusivity is a function of its microstructure on many scales, ranging from nanometres to millimetres. Multi-scale techniques are therefore needed to model this parameter. Representative elementary volume (REV), in conjunction with the homogenization principle, is one of the most common multi-scale approaches. This study aimed to establish a three-step, numerical-statistical procedure for determining the REV required to measure cement paste diffusivity. First, several series of 3D cement paste microstructures were generated with HYMOSTRUC3D, a cement hydration and microstructure model, for different volumes of cement paste and w/c ratios ranging from 0.30 to 0.60. Second, the finite element method was used to simulate the diffusion of tritiated water through these microstructures. Effective cement paste diffusivity values for the different volumes were obtained by applying Fick's law. Finally, statistical analysis was used to find the fluctuation in effective diffusivity with cement paste volume, from which the REV was then determined. The conclusion drawn was that the REV for measuring diffusivity in cement paste is 100 × 100 × 100 μm³.

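    The statistical step of the three-step procedure can be illustrated as follows: compute effective diffusivity for several realizations of each volume and take the REV as the smallest volume whose scatter falls below a tolerance. This is a minimal sketch; the simulate_diffusivity stub and its noise model are invented stand-ins for the HYMOSTRUC3D-plus-finite-element pipeline.

    ```python
    # Hedged sketch of an REV criterion: grow the sample volume until the
    # relative fluctuation of effective diffusivity across realizations is small.
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_diffusivity(edge_um, n_realizations=10):
        # Placeholder for the FEM step: scatter shrinks as the volume grows.
        scale = 3.0e-13 * (100.0 / edge_um) ** 2
        return rng.normal(1.0e-11, scale, n_realizations)

    for edge in (25, 50, 100, 200):              # cube edge length in micrometres
        d_eff = simulate_diffusivity(edge)
        cv = d_eff.std() / d_eff.mean()          # fluctuation across realizations
        print(f"{edge:>4} um cube: CV = {cv:.3f}")
        if cv < 0.05:                            # tolerance taken to define the REV
            print(f"REV reached at {edge} x {edge} x {edge} um^3")
            break
    ```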

  14. Decision support using nonparametric statistics

    CERN Document Server

    Beatty, Warren

    2018-01-01

    This concise volume covers the nonparametric statistics topics most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.

  15. Statistical methods for physical science

    CERN Document Server

    Stanford, John L

    1994-01-01

    This volume of Methods of Experimental Physics provides an extensive introduction to probability and statistics in many areas of the physical sciences, with an emphasis on the emerging area of spatial statistics. The scope of topics covered is wide-ranging: the text discusses a variety of the most commonly used classical methods and addresses newer methods that are applicable or potentially important. The chapter authors motivate readers with their insightful discussions, augmenting their material with key features such as an examination of basic probability, including coverage of standard distributions and time series.

  16. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. A multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could serve as a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
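
    A minimal sketch of the kind of PCA-based MSPC monitoring the abstract describes, assuming synthetic "spectra" in place of real NIR measurements: Hotelling T2 is computed in score space and an SPE/Q statistic (playing the role of DModX) in residual space, with empirical 99% limits taken from the in-control batches.

    ```python
    # Hedged MSPC sketch: PCA fit on in-control spectra, T^2 and SPE monitoring.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    normal = rng.normal(size=(100, 60))          # placeholder in-control spectra
    test = rng.normal(size=(20, 60)) + 0.3       # a shifted "test batch"

    pca = PCA(n_components=3).fit(normal)

    def t2_and_spe(x):
        scores = pca.transform(x)
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)   # score space
        spe = np.sum((x - pca.inverse_transform(scores))**2, axis=1)  # residuals
        return t2, spe

    # Empirical 99% control limits from the in-control data.
    t2_lim, spe_lim = (np.percentile(v, 99) for v in t2_and_spe(normal))
    t2, spe = t2_and_spe(test)
    print("out-of-control points:", np.sum((t2 > t2_lim) | (spe > spe_lim)))
    ```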

  17. Designing Solutions by a Student Centred Approach: Integration of Chemical Process Simulation with Statistical Tools to Improve Distillation Systems

    Directory of Open Access Journals (Sweden)

    Isabel M. Joao

    2017-09-01

    Full Text Available Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of such projects to improve a student-centred approach and boost higher-order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master's students to model distillation systems, together with statistical experimental design techniques, in order to optimize the systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students to model steady-state processes and dynamic processes and to optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students' and teachers' perspectives.
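
    As a sketch of the statistical side of such projects, the snippet below fits main effects and an interaction from a coded 2x2 factorial design; the two factors (say, reflux ratio and feed stage) and the purity responses are hypothetical stand-ins for Aspen HYSYS outputs.

    ```python
    # Hedged sketch of a 2^2 factorial analysis on simulator output.
    import numpy as np

    # Coded design matrix columns: intercept, factor A, factor B, interaction AB.
    A = np.array([-1, 1, -1, 1])
    B = np.array([-1, -1, 1, 1])
    X = np.column_stack([np.ones(4), A, B, A * B])
    y = np.array([0.912, 0.958, 0.934, 0.981])   # hypothetical distillate purity

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    for name, b in zip(("mean", "reflux", "feed stage", "interaction"), beta):
        print(f"{name:>12}: {b:+.4f}")
    ```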

  18. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Science.gov (United States)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
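
    A toy illustration of the interpretation task being automated: flag points beyond the 3-sigma limits and runs of eight consecutive points on one side of the centre line, two common Western Electric-style rules. The data and the injected shift are synthetic.

    ```python
    # Hedged sketch of rule-based control-chart interpretation.
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.normal(10.0, 1.0, 60)
    x[40:] += 1.5                                # inject a sustained process shift

    centre = x[:30].mean()                       # limits from an in-control period
    sigma = x[:30].std(ddof=1)
    rule1 = np.where(np.abs(x - centre) > 3 * sigma)[0]

    side = np.sign(x - centre)                   # +1 above / -1 below centre line
    rule4 = [i for i in range(len(x) - 7) if abs(side[i:i + 8].sum()) == 8]
    print("points beyond 3-sigma:", rule1)       # may be empty for a small shift
    print("runs of 8 on one side start at:", rule4[:3])
    ```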

  19. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    1997-01-01

    Like the preceding volumes, which met with a lively response, the present volume collects contributions stressing either methodology or successful industrial applications. The papers are classified under four main headings: sampling inspection, process quality control, data analysis and process capability studies, and experimental design.

  20. Segmentation of Brain MRI Using SOM-FCM-Based Method and 3D Statistical Descriptors

    Directory of Open Access Journals (Sweden)

    Andrés Ortiz

    2013-01-01

    Full Text Available Current medical imaging systems provide excellent spatial resolution, high tissue contrast, and up to 65535 intensity levels. Thus, image processing techniques which aim to exploit the information contained in the images are necessary for using these images in computer-aided diagnosis (CAD) systems. Image segmentation may be defined as the process of parcelling the image to delimit the different neuroanatomical tissues present in the brain. In this paper we propose a segmentation technique using 3D statistical features extracted from the volume image. In addition, the presented method is based on unsupervised vector quantization and fuzzy clustering techniques and does not use any a priori information. The resulting fuzzy segmentation method addresses the problem of partial volume effect (PVE) and has been assessed using real brain images from the Internet Brain Segmentation Repository (IBSR).
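
    A minimal fuzzy c-means sketch of the unsupervised clustering component, assuming a feature matrix of per-voxel 3D statistical descriptors; this is generic FCM, not the authors' SOM-FCM pipeline.

    ```python
    # Hedged sketch: plain fuzzy c-means on invented per-voxel features.
    import numpy as np

    def fuzzy_cmeans(x, c=3, m=2.0, n_iter=100, seed=0):
        """Return the membership matrix u and the cluster centers."""
        rng = np.random.default_rng(seed)
        u = rng.random((len(x), c))
        u /= u.sum(axis=1, keepdims=True)        # memberships sum to 1 per voxel
        for _ in range(n_iter):
            w = u ** m                           # fuzzified weights
            centers = (w.T @ x) / w.sum(axis=0)[:, None]
            d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            u = d ** (-2.0 / (m - 1.0))
            u /= u.sum(axis=1, keepdims=True)
        return u, centers

    # Usage on invented features (e.g., intensity plus two texture descriptors).
    rng = np.random.default_rng(1)
    features = np.vstack([rng.normal(mu, 0.05, (200, 3)) for mu in (0.2, 0.5, 0.8)])
    u, centers = fuzzy_cmeans(features, c=3)
    labels = u.argmax(axis=1)                    # defuzzified tissue labels
    ```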

  1. Discussion of "Modern statistics for spatial point processes"

    DEFF Research Database (Denmark)

    Jensen, Eva Bjørn Vedel; Prokesová, Michaela; Hellmund, Gunnar

    2007-01-01

    ABSTRACT. The paper ‘Modern statistics for spatial point processes’ by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti...

  2. Making Women Count: Gender-Typing, Technology and Path Dependencies in Dutch Statistical Data Processing

    NARCIS (Netherlands)

    van den Ende, Jan; van Oost, Elizabeth C.J.

    2001-01-01

    This article is a longitudinal analysis of the relation between gendered labour divisions and new data processing technologies at the Dutch Central Bureau of Statistics (CBS). Following social-constructivist and evolutionary economic approaches, the authors hold that the relation between technology

  3. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  4. IBM SPSS statistics 19 made simple

    CERN Document Server

    Gray, Colin D

    2012-01-01

    This new edition of one of the most widely read textbooks in its field introduces the reader to data analysis with the most powerful and versatile statistical package on the market: IBM SPSS Statistics 19. Each new release of SPSS Statistics features new options and other improvements. There remains a core of fundamental operating principles and techniques which have continued to apply to all releases issued in recent years and have proved to be worth communicating in a small volume. This practical and informal book combines simplicity and clarity of presentation with a comprehensive treatment.

  5. Genetic correlations between brain volumes and the WAIS-III dimensions of verbal comprehension, working memory, perceptual organization, and processing speed

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Baare, Wim F.C.; Hulshoff Pol, Hilleke E.

    2003-01-01

    We recently showed that the correlation of gray and white matter volume with full scale IQ and the Working Memory dimension are completely mediated by common genetic factors (Posthuma et al., 2002). Here we examine whether the other WAIS III dimensions (Verbal Comprehension, Perceptual Organization, Processing Speed) are also related to gray and white matter volume, and whether any of the dimensions are related to cerebellar volume. Two overlapping samples provided 135 subjects from 60 extended twin families for whom both MRI scans and WAIS III data were available. All three brain volumes are related to Working Memory capacity (r = 0.27). This phenotypic correlation is completely due to a common underlying genetic factor. Processing Speed was genetically related to white matter volume (r(g) = 0.39). Perceptual Organization was both genetically (r(g) = 0.39) and environmentally (r(e) = -0.71) related to cerebellar volume.

  6. An integrated model of statistical process control and maintenance based on the delayed monitoring

    International Nuclear Information System (INIS)

    Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei

    2015-01-01

    This paper develops an integrated model of statistical process control and maintenance decisions. The proposed delayed monitoring policy postpones sampling until a scheduled time and gives rise to ten scenarios of the production process, in which equipment failure may occur in addition to a quality shift. Equipment failure and a control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, cycle time and cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to validate the analysis of the ten-scenario model. An ordinary integrated model without delayed monitoring is also established for comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy helps reduce the expected cost.
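
    An illustrative Monte Carlo sketch of the cost trade-off behind a delayed monitoring policy, far simpler than the ten-scenario model: failure before the delayed sample triggers corrective maintenance, a quality shift accrues defect costs until sampling, and all distributions and costs are invented placeholders.

    ```python
    # Hedged Monte Carlo sketch of the expected cost of a monitoring delay.
    import numpy as np

    rng = np.random.default_rng(1)

    def expected_cycle_cost(t_sample, n=100_000):
        t_fail = rng.weibull(2.0, n) * 30.0      # equipment failure times (h)
        t_shift = rng.exponential(40.0, n)       # quality-shift times (h)
        cost = np.where(
            t_fail < t_sample,                   # failure before the sample:
            5000.0 + 20.0 * t_fail,              #   corrective maintenance
            np.where(t_shift < t_sample,         # shift caught at sampling:
                     1500.0 + 5.0 * (t_sample - t_shift),  # PM + defectives
                     300.0))                     # in-control: sampling cost only
        return cost.mean()

    for t in (5.0, 10.0, 20.0, 40.0):            # candidate monitoring delays (h)
        print(f"delay {t:>4} h -> expected cost {expected_cycle_cost(t):8.1f}")
    ```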

  7. Studies in Theoretical and Applied Statistics

    CERN Document Server

    Pratesi, Monica; Ruiz-Gazen, Anne

    2018-01-01

    This book includes a wide selection of the papers presented at the 48th Scientific Meeting of the Italian Statistical Society (SIS2016), held in Salerno on 8-10 June 2016. Covering a wide variety of topics ranging from modern data sources and survey design issues to measuring sustainable development, it provides a comprehensive overview of the current Italian scientific research in the fields of open data and big data in public administration and official statistics, survey sampling, ordinal and symbolic data, statistical models and methods for network data, time series forecasting, spatial analysis, environmental statistics, economic and financial data analysis, statistics in the education system, and sustainable development. Intended for researchers interested in theoretical and empirical issues, this volume provides interesting starting points for further research.

  8. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
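
    The density and level arithmetic behind a multi-probe bubbler system can be sketched as follows; probe elevations, pressures and the tank datum are invented, and real systems add the temperature corrections the abstract mentions.

    ```python
    # Hedged sketch of bubbler-probe density and liquid-level arithmetic.
    G = 9.80665  # standard gravity, m/s^2

    def density(dp_pa, dz_m):
        """Solution density from the pressure difference between two probes."""
        return dp_pa / (G * dz_m)

    def liquid_height(p_probe_pa, rho, z_probe_m):
        """Liquid level above the tank datum from one probe's gauge pressure."""
        return z_probe_m + p_probe_pa / (rho * G)

    # Example: probes 0.5 m apart read a 5.65 kPa difference -> rho ~ 1152 kg/m^3
    rho = density(5650.0, 0.5)
    level = liquid_height(22_000.0, rho, 0.2)
    print(f"rho = {rho:.0f} kg/m^3, level = {level:.3f} m")
    ```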

  9. DEVELOPMENT OF A METHOD STATISTICAL ANALYSIS ACCURACY AND PROCESS STABILITY PRODUCTION OF EPOXY RESIN ED-20

    Directory of Open Access Journals (Sweden)

    N. V. Zhelninskaya

    2015-01-01

    Full Text Available Statistical methods play an important role in the objective evaluation of quantitative and qualitative characteristics of a process and are one of the most important elements of the quality assurance system in production and of total quality management. To produce a quality product, one must know the actual accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the specified product accuracy, and assess process stability. Most random events, particularly in manufacturing and scientific research, are characterized by the presence of a large number of random factors and are described by a normal distribution, which is fundamental to many practical studies. Modern statistical methods are quite difficult to grasp and to use widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, one can obtain all the characteristics of a batch of products and determine the mean value and the variance. Statistical quality control methods were used to analyse the accuracy and stability of the technological process for the production of epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects in the investigated products was determined. To assess the stability of the ED-20 manufacturing process, Shewhart control charts for quantitative data were selected: charts of individual values (X) and moving ranges (R). Pareto charts were used to identify the causes that most strongly affect low dynamic viscosity. The causes of low dynamic viscosity values were analysed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
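
    A compact sketch of the accuracy-and-stability arithmetic: individuals/moving-range (X-mR) control limits with the standard constants, plus a Cpk capability index. The viscosity readings and specification limits are invented.

    ```python
    # Hedged sketch of X-mR control limits and a capability index.
    import numpy as np

    viscosity = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.6, 12.2, 11.7, 12.3, 12.0])

    mr = np.abs(np.diff(viscosity))            # moving ranges of consecutive values
    x_bar, mr_bar = viscosity.mean(), mr.mean()

    # Standard X-mR constants; sigma is estimated as mr_bar / d2 with d2 = 1.128.
    ucl_x = x_bar + 2.66 * mr_bar
    lcl_x = x_bar - 2.66 * mr_bar
    ucl_mr = 3.267 * mr_bar

    sigma_hat = mr_bar / 1.128
    lsl, usl = 11.0, 13.0                      # hypothetical specification limits
    cpk = min(usl - x_bar, x_bar - lsl) / (3 * sigma_hat)
    print(f"X chart: [{lcl_x:.2f}, {ucl_x:.2f}]  mR UCL: {ucl_mr:.2f}  Cpk: {cpk:.2f}")
    ```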

  10. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    Science.gov (United States)

    Davis, A. D.

    2015-12-01

    The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI): here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity
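
    A minimal Metropolis-Hastings sketch of the workflow described: sample a posterior over an uncertain parameter, then push the samples through a predictive model to characterize the quantity of interest. The prior, likelihood and "ice volume" model are toy stand-ins, not the authors' ice-stream model.

    ```python
    # Hedged sketch: posterior sampling pushed forward to a QoI distribution.
    import numpy as np

    rng = np.random.default_rng(42)
    obs = 3.2                                   # a single synthetic observation

    def log_post(theta):
        log_prior = -0.5 * theta**2             # standard normal prior
        log_like = -0.5 * ((obs - 2 * theta) / 0.5) ** 2  # toy model y = 2*theta
        return log_prior + log_like

    def qoi(theta):
        return 100.0 * np.exp(-0.3 * theta)     # toy "future grounded ice volume"

    samples, theta = [], 0.0
    for _ in range(20_000):                     # random-walk Metropolis-Hastings
        prop = theta + 0.5 * rng.normal()
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)

    vols = qoi(np.array(samples[5000:]))        # discard burn-in, push to QoI
    print(f"QoI mean {vols.mean():.1f}, 90% interval "
          f"[{np.percentile(vols, 5):.1f}, {np.percentile(vols, 95):.1f}]")
    ```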

  11. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

    The authors have investigated the statistical methods applied to the safety analysis of nuclear reactors and arrived at alarming conclusions: a series of calculations with the generally appreciated safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, we came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another formerly unnoticed conclusion of the previous ATHLET calculations is that certain innocent-looking parameters (like the wall roughness factor, the number of bubbles per unit volume, or the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned with the statistical foundation of present-day safety analysis practices and can only hope that their own misjudgment will be dispelled. Until then, the authors suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code one carries out a series of computations. The input data are not precisely determined because measured data have errors and calculated data are often obtained from a more or less accurate model. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all the possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability is known with which the

  12. Introduction to high-dimensional statistics

    CERN Document Server

    Giraud, Christophe

    2015-01-01

    Ever-greater computing technologies have given rise to an exponentially growing volume of data. Today massive data sets (with potentially thousands of variables) play an important role in almost every branch of modern human activity, including networks, finance, and genetics. However, analyzing such data has presented a challenge for statisticians and data analysts and has required the development of new statistical methods capable of separating the signal from the noise. Introduction to High-Dimensional Statistics is a concise guide to state-of-the-art models, techniques, and approaches for handling high-dimensional data.

  13. Mathematical-statistical models and qualitative theories for economic and social sciences

    CERN Document Server

    Maturo, Fabrizio; Kacprzyk, Janusz

    2017-01-01

    This book presents a broad spectrum of problems related to statistics, mathematics, teaching, social science, and economics, as well as a range of tools and techniques that can be used to solve these problems. It is the result of a scientific collaboration between experts in the field of economic and social systems from the University of Defence in Brno (Czech Republic), G. d'Annunzio University of Chieti-Pescara (Italy), Pablo de Olavide University of Sevilla (Spain), and Ovidius University in Constanţa (Romania). The studies included were selected using a peer-review process and reflect the heterogeneity and complexity of economic and social phenomena. They present interesting empirical research from around the globe and from several research fields, such as statistics, decision making, mathematics, complexity, psychology, sociology and economics. The volume is divided into two parts. The first part, “Recent trends in mathematical and statistical models for economic and social sciences”, collects papers…

  14. Processes for an Architecture of Volume

    DEFF Research Database (Denmark)

    Mcgee, Wes; Feringa, Jelle; Søndergaard, Asbjørn

    2013-01-01

    This paper addresses both the architectural, conceptual motivations and the tools and techniques necessary for the digital production of an architecture of volume. The robotic manufacturing techniques of shaping volumetric materials by hot wire and abrasive wire cutting are discussed through...

  15. Mini-Digest of Education Statistics, 2010. NCES 2011-016

    Science.gov (United States)

    Snyder, Thomas D.

    2011-01-01

    This pocket-sized compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mini-Digest" is designed as an easy reference for materials found in detail in the "Digest of Education Statistics". These volumes include selections of data from many…

  16. Mini-Digest of Education Statistics, 2009. NCES 2010-014

    Science.gov (United States)

    Snyder, Thomas D.

    2010-01-01

    This compilation of statistical information covers prekindergarten through graduate school to describe the current American education scene. The "Mini-Digest" is designed as an easy reference for materials found in detail in the "Digest of Education Statistics, 2009". These volumes include selections of data from many…

  17. Optimized statistical parametric mapping for partial-volume-corrected amyloid positron emission tomography in patients with Alzheimer's disease and Lewy body dementia

    Science.gov (United States)

    Oh, Jungsu S.; Kim, Jae Seung; Chae, Sun Young; Oh, Minyoung; Oh, Seung Jun; Cha, Seung Nam; Chang, Ho-Jong; Lee, Chong Sik; Lee, Jae Hong

    2017-03-01

    We present an optimized voxelwise statistical parametric mapping (SPM) of partial-volume (PV)-corrected positron emission tomography (PET) of 11C Pittsburgh Compound B (PiB), incorporating the anatomical precision of magnetic resonance imaging (MRI) and the amyloid-β (Aβ) burden-specificity of PiB PET. First, we applied region-based partial-volume correction (PVC), termed the geometric transfer matrix (GTM) method, to PiB PET, creating MRI-based lobar parcels filled with mean PiB uptakes. Then, we conducted a voxelwise PVC by multiplying the original PET by the ratio of the GTM-based PV-corrected PET to a 6-mm-smoothed PV-corrected PET. Finally, we conducted spatial normalizations of the PV-corrected PETs onto the study-specific template. As such, we increased the accuracy of the SPM normalization and the tissue specificity of the SPM results. Moreover, lobar smoothing (instead of whole-brain smoothing) was applied to increase the signal-to-noise ratio in the image without degrading the tissue specificity. Thereby, we could optimize a voxelwise group comparison between subjects with high and normal Aβ burdens (from 10 patients with Alzheimer's disease, 30 patients with Lewy body dementia, and 9 normal controls). Our SPM framework outperformed the conventional one in terms of the accuracy of the spatial normalization (85% of maximum likelihood tissue classification volume) and the tissue specificity (a larger gray matter and smaller cerebrospinal fluid volume fraction in the SPM results). Our SPM framework optimized the SPM of PV-corrected Aβ PET in terms of anatomical precision, normalization accuracy, and tissue specificity, resulting in better detection and localization of Aβ burdens in patients with Alzheimer's disease and Lewy body dementia.
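
    The voxelwise correction step has a direct arithmetic reading: multiply the original PET by the ratio of the GTM-based PV-corrected image to its 6-mm-smoothed version. A sketch under assumed array shapes and voxel size follows; pet and gtm_pvc are placeholders for real co-registered volumes.

    ```python
    # Hedged sketch of the ratio-based voxelwise partial-volume correction.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    voxel_mm = 2.0                              # assumed isotropic voxel size
    sigma = 6.0 / 2.3548 / voxel_mm             # 6 mm FWHM -> sigma in voxels

    rng = np.random.default_rng(0)
    pet = rng.random((64, 64, 64))              # placeholder PiB PET volume
    gtm_pvc = rng.random((64, 64, 64))          # placeholder GTM-corrected volume

    smoothed = gaussian_filter(gtm_pvc, sigma)
    corrected = pet * gtm_pvc / np.maximum(smoothed, 1e-6)  # avoid divide-by-zero
    ```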

  18. International Conference on Trends and Perspectives in Linear Statistical Inference

    CERN Document Server

    Rosen, Dietrich

    2018-01-01

    This volume features selected contributions on a variety of topics related to linear statistical inference. The peer-reviewed papers from the International Conference on Trends and Perspectives in Linear Statistical Inference (LinStat 2016) held in Istanbul, Turkey, 22-25 August 2016, cover topics in both theoretical and applied statistics, such as linear models, high-dimensional statistics, computational statistics, the design of experiments, and multivariate analysis. The book is intended for statisticians, Ph.D. students, and professionals who are interested in statistical inference.

  19. Vitrification process for the volume reduction and stabilization of organic resins

    International Nuclear Information System (INIS)

    Buelt, J.L.

    1982-10-01

    Pacific Northwest Laboratory has completed a series of experimental tests sponsored by the US Department of Energy (DOE) to determine the feasibility of incinerating and vitrifying organic ion-exchange resins in a single-step process. The resins used in this study were identical to those used for decontaminating auxiliary building water at the Three Mile Island (TMI) Unit 2 reactor. The primarily organic resins were loaded with nonradioactive isotopes of cesium and strontium for processing in a pilot-scale, joule-heated glass melter modified to support resin combustion. The feasibility tests demonstrated an average process rate of 3.0 kg/h. Based on this rate, if 50 organic resin liners were vitrified in a six-month campaign, a melter 2.5 times the size of the pilot-scale unit would be adequate. A maximum achievable volume reduction of 91% was demonstrated in these tests.

  20. Lévy matters IV estimation for discretely observed Lévy processes

    CERN Document Server

    Belomestny, Denis; Genon-Catalot, Valentine; Masuda, Hiroki; Reiß, Markus

    2015-01-01

    The aim of this volume is to provide an extensive account of the most recent advances in statistics for discretely observed Lévy processes. These days, statistics for stochastic processes is a lively topic, driven by the needs of various fields of application, such as finance, the biosciences, and telecommunication. The three chapters of this volume are completely dedicated to the estimation of Lévy processes and are written by experts in the field. The first chapter, by Denis Belomestny and Markus Reiß, treats the low-frequency situation, where estimation methods are based on the empirical characteristic function. The second chapter, by Fabienne Comte and Valentine Genon-Catalot, is dedicated to non-parametric estimation, mainly covering the high-frequency data case. A distinctive feature of this part is the construction of adaptive estimators based on deconvolution, projection or kernel methods. The last chapter, by Hiroki Masuda, considers the parametric situation. The chapters cover the main aspects of the estimation of discretely observed Lévy processes.

  1. Growth Curve Models and Applications : Indian Statistical Institute

    CERN Document Server

    2017-01-01

    Growth curve models in longitudinal studies are widely used to model population size, body height, biomass, fungal growth, and other variables in the biological sciences, but these statistical methods for modeling growth curves and analyzing longitudinal data also extend to general statistics, economics, public health, demographics, epidemiology, SQC, sociology, nano-biotechnology, fluid mechanics, and other applied areas. There is no one-size-fits-all approach to growth measurement. The selected papers in this volume build on presentations from the GCM workshop held at the Indian Statistical Institute, Giridih, on March 28-29, 2016. They represent recent trends in GCM research in different subject areas, both theoretical and applied. This book includes tools and possibilities for further work through new techniques and modification of existing ones. The volume includes original studies, theoretical findings and case studies from a wide range of applied work, and these contributions have been externally reviewed.

  2. Machine vision for high-precision volume measurement applied to levitated containerless material processing

    International Nuclear Information System (INIS)

    Bradshaw, R.C.; Schmidt, D.P.; Rogers, J.R.; Kelton, K.F.; Hyers, R.W.

    2005-01-01

    By combining the best practices in optical dilatometry with numerical methods, a high-speed and high-precision technique has been developed to measure the volume of levitated, containerlessly processed samples with subpixel resolution. Containerless processing provides the ability to study highly reactive materials without the possibility of contamination affecting thermophysical properties. Levitation is a common technique used to isolate a sample as it is being processed. Noncontact optical measurement of thermophysical properties is very important as traditional measuring methods cannot be used. Modern, digitally recorded images require advanced numerical routines to recover the subpixel locations of sample edges and, in turn, produce high-precision measurements
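
    One common way to recover subpixel edge locations, sketched below, is to fit a parabola through the gradient peak of an intensity profile and take its vertex; this is a generic technique, not necessarily the authors' routine.

    ```python
    # Hedged sketch of subpixel edge location via parabolic interpolation.
    import numpy as np

    def subpixel_edge(profile):
        g = np.abs(np.gradient(profile.astype(float)))
        i = int(np.argmax(g))                   # integer-pixel gradient peak
        if 0 < i < len(g) - 1:
            denom = g[i - 1] - 2 * g[i] + g[i + 1]
            if denom != 0:
                return i + 0.5 * (g[i - 1] - g[i + 1]) / denom  # parabola vertex
        return float(i)

    # A blurred step edge whose true transition lies between pixels 10 and 11:
    x = np.arange(20)
    profile = 1 / (1 + np.exp(-(x - 10.3)))
    print(subpixel_edge(profile))               # ~10.3, recovered to subpixel accuracy
    ```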

  3. A high volume cost efficient production macrostructuring process. [for silicon solar cell surface treatment

    Science.gov (United States)

    Chitre, S. R.

    1978-01-01

    The paper presents an experimentally developed surface macrostructuring process suitable for high-volume production of silicon solar cells. The process lends itself easily to automation for high throughput to meet low-cost solar array goals. The tetrahedron structures observed are 0.5 to 12 microns high. The surface has minimal pitting, with virtually no or very few undeveloped areas. This process has been developed for (100)-oriented as-cut silicon. Chemi-etched, hydrophobic and lapped surfaces were successfully texturized. A cost analysis as per SAMICS is presented.

  4. Statistical Uncertainty in the Medicare Shared Savings...

    Data.gov (United States)

    U.S. Department of Health & Human Services — According to analysis reported in Statistical Uncertainty in the Medicare Shared Savings Program published in Volume 2, Issue 4 of the Medicare and Medicaid Research...

  5. Auditory Magnetoencephalographic Frequency-Tagged Responses Mirror the Ongoing Segmentation Processes Underlying Statistical Learning.

    Science.gov (United States)

    Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe

    2017-03-01

    Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses developed rapidly, reaching significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.
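
    The frequency-tagging logic can be sketched numerically: with tones at 3 Hz grouped into triplets, learning of the triplet structure appears as spectral power at 1 Hz. The "MEG" signal below is synthetic.

    ```python
    # Hedged sketch of frequency tagging: amplitudes at the tone and triplet rates.
    import numpy as np

    rng = np.random.default_rng(7)
    fs, dur = 250.0, 120.0                       # sampling rate (Hz), duration (s)
    t = np.arange(0, dur, 1 / fs)
    sig = (0.5 * np.sin(2 * np.pi * 3.0 * t)     # response at the tone rate (3 Hz)
           + 0.8 * np.sin(2 * np.pi * 1.0 * t)   # response at the triplet rate (1 Hz)
           + rng.normal(0.0, 1.0, t.size))       # background noise

    amp = np.abs(np.fft.rfft(sig)) * 2 / t.size  # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    for f in (1.0, 3.0):
        print(f"{f} Hz: amplitude ~ {amp[np.argmin(np.abs(freqs - f))]:.2f}")
    ```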

  6. Quantitative study on the statistical properties of fibre architecture of genuine and numerical composite microstructures

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

    A quantitative study is carried out regarding the statistical properties of the fibre architecture found in composite laminates and that generated numerically using Statistical Representative Volume Elements (SRVE’s). The aim is to determine the reliability and consistency of SRVE’s for representing the fibre architecture of genuine composite microstructures.

  7. Statistical inference for financial engineering

    CERN Document Server

    Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki

    2014-01-01

    This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.

  8. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  9. Effect of the spray volume adjustment model on the efficiency of fungicides and residues in processing tomato

    Energy Technology Data Exchange (ETDEWEB)

    Ratajkiewicz, H.; Kierzek, R.; Raczkowski, M.; Hołodyńska-Kulas, A.; Łacka, A.; Wójtowicz, A.; Wachowiak, M.

    2016-11-01

    This study compared the effects of a proportionate spray volume (PSV) adjustment model and a fixed model (300 L/ha) on the infestation of processing tomato with potato late blight (Phytophthora infestans (Mont.) de Bary) (PLB) and on azoxystrobin and chlorothalonil residues in fruits in three consecutive seasons. The fungicides were applied in an alternating system with or without two spreader adjuvants. The proportionate spray volume adjustment model was based on the number of leaves on the plants and a spray volume index. The modified Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method was optimized and validated for the extraction of azoxystrobin and chlorothalonil residues. Gas chromatography with a nitrogen-phosphorus detector and an electron capture detector was used for the analysis of the fungicides. The results showed that higher fungicidal residues were associated with lower infestation of tomato with PLB. The PSV adjustment model resulted in lower infestation of tomato than the fixed model (300 L/ha) when the fungicides were applied at half the dose without adjuvants. Higher expected spray interception into the tomato canopy with the PSV system was recognized as the reason for the better control of PLB. The spreader adjuvants did not have a positive effect on the biological efficacy of the spray volume application systems. The results suggest that the PSV adjustment model can be used to determine the spray volume for fungicide application in the processing tomato crop. (Author)

  10. Effect of the spray volume adjustment model on the efficiency of fungicides and residues in processing tomato

    Directory of Open Access Journals (Sweden)

    Henryk Ratajkiewicz

    2016-08-01

    Full Text Available This study compared the effects of a proportionate spray volume (PSV) adjustment model and a fixed model (300 L/ha) on the infestation of processing tomato with potato late blight (Phytophthora infestans (Mont.) de Bary) (PLB) and on azoxystrobin and chlorothalonil residues in fruits in three consecutive seasons. The fungicides were applied in an alternating system with or without two spreader adjuvants. The proportionate spray volume adjustment model was based on the number of leaves on the plants and a spray volume index. The modified Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method was optimized and validated for the extraction of azoxystrobin and chlorothalonil residues. Gas chromatography with a nitrogen-phosphorus detector and an electron capture detector was used for the analysis of the fungicides. The results showed that higher fungicidal residues were associated with lower infestation of tomato with PLB. The PSV adjustment model resulted in lower infestation of tomato than the fixed model (300 L/ha) when the fungicides were applied at half the dose without adjuvants. Higher expected spray interception into the tomato canopy with the PSV system was recognized as the reason for the better control of PLB. The spreader adjuvants did not have a positive effect on the biological efficacy of the spray volume application systems. The results suggest that the PSV adjustment model can be used to determine the spray volume for fungicide application in the processing tomato crop.

  11. ANALYSIS OF OIL LOSSES IN CRUDE PALM OIL (CPO) USING THE STATISTICAL PROCESS CONTROL METHOD

    Directory of Open Access Journals (Sweden)

    Vera Devani

    2014-06-01

    Full Text Available PKS “XYZ” is a company operating in the palm oil processing sector. Its products are Crude Palm Oil (CPO) and Palm Kernel Oil (PKO). The aim of this study was to analyse oil losses and their causal factors using the Statistical Process Control method. Statistical Process Control is a set of strategies, techniques, and actions taken by an organization to ensure that they result in a quality product or a quality service. The CPO oil-loss samples examined were empty fruit bunches, nuts, fibre, and final sludge. Based on the I-MR control charts, it can be concluded that all four types of CPO oil losses were within the control limits and consistent. The Cpk value of the total oil losses was outside the control limits of the process mean, which means that the CPO produced met customer requirements, with total oil losses below the maximum limit set by the company, namely 1.65%.

  12. Statistical reliability analyses of two wood plastic composite extrusion processes

    International Nuclear Information System (INIS)

    Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.

    2011-01-01

    Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC reliability metrics of the two extrusion lines that may be helpful to the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE, and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting, as related to the selection of the proper statistical methods, is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in product reliability between extrusion processes may help WPC producers improve the reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
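
    A sketch of the nonparametric comparison, assuming strength data with no censoring so the Kaplan-Meier estimate reduces to an empirical survival curve; the strength values are invented, and ties are ignored for brevity.

    ```python
    # Hedged Kaplan-Meier sketch comparing two extrusion lines.
    import numpy as np

    def kaplan_meier(times, events):
        """Sorted times and the Kaplan-Meier survival estimate (ties ignored)."""
        order = np.argsort(times)
        times, events = times[order], events[order]
        at_risk, s, surv = len(times), 1.0, []
        for event in events:
            if event:                            # an observed failure
                s *= 1.0 - 1.0 / at_risk
            at_risk -= 1
            surv.append(s)
        return times, np.array(surv)

    rng = np.random.default_rng(3)
    line_a = rng.normal(22.0, 1.5, 40)           # MOR-like strengths (MPa), line A
    line_b = rng.normal(23.5, 2.0, 40)           # MOR-like strengths (MPa), line B
    for name, data in (("line A", line_a), ("line B", line_b)):
        t, s = kaplan_meier(data, np.ones(data.size, dtype=bool))
        median = t[np.argmax(s <= 0.5)]          # first point where survival <= 50%
        print(f"{name}: median strength ~ {median:.1f} MPa")
    ```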

  13. Review of the patient positioning reproducibility in head-and-neck radiotherapy using Statistical Process Control.

    Science.gov (United States)

    Moore, Sarah J; Herst, Patries M; Louwe, Robert J W

    2018-05-01

    A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Fault diagnosis and comparing risk for the steel coil manufacturing process using statistical models for binary data

    International Nuclear Information System (INIS)

    Debón, A.; Carlos Garcia-Díaz, J.

    2012-01-01

    Advanced statistical models can help industry to design more economical and rational investment plans. Fault detection and diagnosis is an important problem in continuous hot-dip galvanizing. Increasingly stringent quality requirements in the automotive industry also require ongoing efforts in process control to make processes more robust. Robust methods for estimating the quality of galvanized steel coils are an important tool for the comprehensive monitoring of the performance of the manufacturing process. This study applies different statistical regression models: generalized linear models, generalized additive models and classification trees, to estimate the quality of galvanized steel coils on the basis of short time histories. The data, consisting of 48 galvanized steel coils, were divided into sets of conforming and nonconforming coils. Five variables were selected for monitoring the process: steel strip velocity and four bath temperatures. The present paper reports a comparative evaluation of statistical models for binary data using Receiver Operating Characteristic (ROC) curves. A ROC curve is a technique for visualizing, organizing and selecting classifiers based on their performance. The purpose of this paper is to examine their use in research to obtain the best model for predicting defective steel coil probability. In contrast with the work of other authors, who only propose goodness-of-fit statistics, one distinctive feature of the methodology presented here is the possibility of comparing the different models with ROC graphs, which are based on model classification performance. Finally, the results are validated by bootstrap procedures.
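
    A sketch of comparing binary classifiers by ROC area under the curve, in the spirit of the comparison described; the five "process variables" and the conforming/nonconforming labels are synthetic.

    ```python
    # Hedged sketch of ROC/AUC model comparison on synthetic process data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    X = rng.normal(size=(200, 5))                # velocity + four bath temperatures
    w = np.array([0.8, 0.5, -0.4, 0.3, 0.1])
    y = (X @ w + rng.normal(0, 1, 200)) > 0      # nonconforming-coil flag

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    for model in (LogisticRegression(max_iter=1000),
                  DecisionTreeClassifier(max_depth=3)):
        p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
        print(type(model).__name__, round(roc_auc_score(y_te, p), 3))
    ```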

  15. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  16. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive statistics and inferential statistics. The students' understanding of descriptive and inferential statistics is important for students in the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to establish the relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find students making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions. This is a fatal mistake for those doing quantitative research. Several things were gained from the implementation of reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade achieved was an A, earned by 18 students. 3. According to all students, they could develop their critical stance and build care for one another through the learning process in this course. 4. All students agreed that, through the learning process they underwent in the course, they could build care for one another.

  17. The use of process models to inform and improve statistical models of nitrate occurrence, Great Miami River Basin, southwestern Ohio

    Science.gov (United States)

    Walter, Donald A.; Starn, J. Jeffrey

    2013-01-01

    Statistical models of nitrate occurrence in the glacial aquifer system of the northern United States, developed by the U.S. Geological Survey, use observed relations between nitrate concentrations and sets of explanatory variables—representing well-construction, environmental, and source characteristics—to predict the probability that nitrate, as nitrogen, will exceed a threshold concentration. However, the models do not explicitly account for the processes that control the transport of nitrogen from surface sources to a pumped well, and use area-weighted mean spatial variables computed from within a circular buffer around the well as a simplified source-area conceptualization. The use of models that explicitly represent physical-transport processes can inform and, potentially, improve these statistical models. Specifically, groundwater-flow models simulate advective transport—predominant in many surficial aquifers—and can contribute to the refinement of the statistical models by (1) providing for improved, physically based representations of a source area to a well, and (2) allowing for more detailed estimates of environmental variables. A source area to a well, known as a contributing recharge area, represents the area at the water table that contributes recharge to a pumped well; a well pumped at a volumetric rate equal to the amount of recharge through a circular buffer will result in a contributing recharge area that is the same size as the buffer but has a shape that is a function of the hydrologic setting. These volume-equivalent contributing recharge areas will approximate circular buffers in areas of relatively flat hydraulic gradients, such as near groundwater divides, but in areas with steep hydraulic gradients will be elongated in the upgradient direction and agree less with the corresponding circular buffers. The degree to which process-model-estimated contributing recharge areas, which simulate advective transport and therefore account for

  18. The application of statistical process control in linac quality assurance

    International Nuclear Information System (INIS)

    Li Dingyu; Dai Jianrong

    2009-01-01

    Objective: To improve a linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC was applied to set control limits for QA data, draw control charts, and differentiate random from systematic errors. An SPC quality assurance software package named QA MANAGER was developed in Visual Basic for clinical use. Two clinical cases were analyzed with SPC to study the daily output QA of a 6 MV photon beam. Results: In the clinical cases, SPC was able to identify the systematic errors. Conclusion: SPC may assist in detecting systematic errors in linac quality assurance, alarming on abnormal trends so that systematic errors can be eliminated and quality control improved. (authors)
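
    A minimal sketch of the kind of individuals (X) control chart such a QA program might use, assuming hypothetical daily output readings; the limits follow the standard moving-range estimate of sigma and are not specific to the QA MANAGER software:

```python
import numpy as np

# Hypothetical daily output measurements (cGy per 100 MU) for a 6 MV beam.
rng = np.random.default_rng(1)
output = 100.0 + rng.normal(0.0, 0.4, size=60)
output[45:] += 1.2  # simulate a systematic drift after a service event

# Individuals (X) chart: sigma is estimated from the average moving range,
# MRbar / d2 with d2 = 1.128 for subgroups of size 2, as in standard SPC texts.
mr = np.abs(np.diff(output))
center = output.mean()
sigma = mr.mean() / 1.128
ucl, lcl = center + 3 * sigma, center - 3 * sigma

for day, x in enumerate(output, start=1):
    if not lcl <= x <= ucl:
        print(f"day {day}: output {x:.2f} outside [{lcl:.2f}, {ucl:.2f}]")
```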

  19. Damage localization by statistical evaluation of signal-processed mode shapes

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Damkilde, Lars

    2015-01-01

    Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in the mode shapes; here this is done through signal processing of the mode shapes and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.
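
    The outlier-analysis step can be illustrated with a small sketch, using hypothetical feature vectors standing in for the signal-processed mode shapes; the computation follows the standard Hotelling T2 form on leading principal components:

```python
import numpy as np

# Hypothetical signal-processed mode shape features: rows are sensor
# locations, columns are mode shapes (after GDTKEO-style processing).
rng = np.random.default_rng(2)
features = rng.normal(size=(40, 6))
features[17] += 4.0  # a damage-induced discontinuity at one location

# Principal components, then Hotelling's T^2 of the leading scores.
X = features - features.mean(axis=0)
_, s, vt = np.linalg.svd(X, full_matrices=False)
k = 3
scores = X @ vt[:k].T                 # leading principal component scores
var = (s[:k] ** 2) / (len(X) - 1)     # variances of those scores
t2 = np.sum(scores**2 / var, axis=1)  # Hotelling's T^2 statistic per location

threshold = np.percentile(t2, 97.5)   # simple empirical control limit
print("suspected damage locations:", np.where(t2 > threshold)[0])
```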

  20. Evaluation of economic and technical efficiency of diesel engines operation on the basis of volume combustion rate

    Directory of Open Access Journals (Sweden)

    І. О. Берестовой

    2016-11-01

    Full Text Available The article presents a new approach to evaluating the complex efficiency of diesel engines. Traditionally, cylinder capacity, rotation frequency, mean effective cylinder pressure, piston stroke, mean piston velocity, specific fuel consumption and other indices are used as the generalizing criteria characterizing a diesel engine's efficiency, but they do not reflect the interrelation between the engine's complex efficiency and the set of economic, mass-dimensional, operational and ecological efficiency indicators. The approach applied in the article makes it possible to reveal and modify the existing methods of improving diesel engine efficiency with due regard to the interrelation of the parameters characterizing the efficiency of engine operation. Statistical analyses were carried out, on the basis of which it was assumed that specific fuel consumption and the analyzed engine parameters are interrelated. Processing of statistical data for the various functions analyzed led to a function linking volume combustion rate, piston area and nominal theoretical specific fuel consumption. The interrelation between volume combustion rate, nominal parameters of diesel operation and efficiency indices, obtained by processing statistical data on more than 500 diesel models of different series, was evaluated, its main feature being a mathematical trend. The analysis of the obtained function makes it possible to establish the interrelation between the economic efficiency of a diesel engine, whose main index is specific fuel consumption, and the volume combustion rate and design peculiarities.

  1. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine prospective data collection started in 2006. Patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that
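
    A p-chart of the kind described can be sketched on hypothetical monthly audit counts; because the number of audited patients varies, the control limits vary month to month:

```python
import numpy as np

# Hypothetical monthly audit data: patients with an epidural (n) and the
# number of them reporting severe pain.
n = np.array([28, 31, 25, 30, 27, 33, 29, 26, 30, 32])
severe = np.array([6, 7, 4, 8, 5, 9, 3, 2, 3, 2])

p = severe / n
pbar = severe.sum() / n.sum()  # overall proportion (center line)

# p-chart: 3-sigma limits depend on each month's sample size.
for month, (pi, ni) in enumerate(zip(p, n), start=1):
    se = np.sqrt(pbar * (1 - pbar) / ni)
    ucl, lcl = pbar + 3 * se, max(pbar - 3 * se, 0.0)
    flag = "out of control" if not lcl <= pi <= ucl else "in control"
    print(f"month {month}: p={pi:.2f} limits=({lcl:.2f}, {ucl:.2f}) {flag}")
```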

  2. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  3. Petroleum supply annual 1996: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1996 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Capacity, each with final annual data. The summary statistics section shows 16 years of data depicting the balance between supply, disposition and ending stocks for various commodities including crude oil, motor gasoline, distillate fuel oil, residual fuel oil, jet fuel, propane/propylene, and liquefied petroleum gases. The detailed statistics section provides 1996 detailed statistics on supply and disposition, refinery operations, imports and exports, stocks, and transportation of crude oil and petroleum products. The refinery capacity section contains listings of refineries and associated crude oil distillation and downstream capacities by State, as of January 1, 1997, as well as summaries of corporate refinery capacities and refinery storage capacities. In addition, refinery receipts of crude oil by method of transportation for 1996 are provided. Also included are fuels consumed at refineries, and lists of shutdowns, sales, reactivations, and mergers during 1995 and 1996. 16 figs., 59 tabs.

  4. Petroleum supply annual 1996: Volume 1

    International Nuclear Information System (INIS)

    1997-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1996 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Capacity, each with final annual data. The summary statistics section shows 16 years of data depicting the balance between supply, disposition and ending stocks for various commodities including crude oil, motor gasoline, distillate fuel oil, residual fuel oil, jet fuel, propane/propylene, and liquefied petroleum gases. The detailed statistics section provides 1996 detailed statistics on supply and disposition, refinery operations, imports and exports, stocks, and transportation of crude oil and petroleum products. The refinery capacity section contains listings of refineries and associated crude oil distillation and downstream capacities by State, as of January 1, 1997, as well as summaries of corporate refinery capacities and refinery storage capacities. In addition, refinery receipts of crude oil by method of transportation for 1996 are provided. Also included are fuels consumed at refineries, and lists of shutdowns, sales, reactivations, and mergers during 1995 and 1996. 16 figs., 59 tabs.

  5. Aarhus Conference on Probability, Statistics and Their Applications : Celebrating the Scientific Achievements of Ole E. Barndorff-Nielsen

    CERN Document Server

    Stelzer, Robert; Thorbjørnsen, Steen; Veraart, Almut

    2016-01-01

    Collecting together twenty-three self-contained articles, this volume presents the current research of a number of renowned scientists in both probability theory and statistics as well as their various applications in economics, finance, the physics of wind-blown sand, queueing systems, risk assessment, turbulence and other areas. The contributions are dedicated to and inspired by the research of Ole E. Barndorff-Nielsen who, since the early 1960s, has been and continues to be a very active and influential researcher working on a wide range of important problems. The topics covered include, but are not limited to, econometrics, exponential families, Lévy processes and infinitely divisible distributions, limit theory, mathematical finance, random matrices, risk assessment, statistical inference for stochastic processes, stochastic analysis and optimal control, time series, and turbulence. The book will be of interest to researchers and graduate students in probability, statistics and their applications.

  6. Errors in patient specimen collection: application of statistical process control.

    Science.gov (United States)

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.

  7. UNMANNED AERIAL VEHICLE USE FOR WOOD CHIPS PILE VOLUME ESTIMATION

    Directory of Open Access Journals (Sweden)

    M. Mokroš

    2016-06-01

    Full Text Available The rapid development of unmanned aerial vehicles presents a challenge for applied research: many technologies are developed, and researchers then look for their applications in different sectors. We therefore decided to verify the use of an unmanned aerial vehicle for wood chip pile monitoring, comparing a GNSS device and an unmanned aerial vehicle for volume estimation of four wood chip piles. We used a DJI Phantom 3 Professional with its built-in camera and a GNSS device (GeoExplorer 6000). Agisoft PhotoScan was used for processing the photos and ArcGIS for processing the points. Volumes calculated from the pictures were not statistically significantly different from volumes calculated from the GNSS data, and a high correlation between them was found (p = 0.9993). We conclude that the use of an unmanned aerial vehicle instead of the GNSS device does not lead to significantly different results, while data collection took almost 12 to 20 times less time with the UAV. Additionally, the UAV provides documentation through the orthomosaic.
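
    The statistical comparison reported, a test of the paired differences together with a correlation between the two volume estimates, can be reproduced in outline on hypothetical pile volumes (assuming NumPy and SciPy are available):

```python
import numpy as np
from scipy import stats

# Hypothetical volumes (m^3) of four wood chip piles, estimated from UAV
# photogrammetry and from GNSS-surveyed points.
uav = np.array([412.5, 287.1, 530.8, 198.4])
gnss = np.array([405.9, 290.3, 524.6, 201.2])

r, _ = stats.pearsonr(uav, gnss)  # agreement between the two methods
t, p = stats.ttest_rel(uav, gnss)  # paired test of the mean difference

print(f"correlation r = {r:.4f}")
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")
```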

  8. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    Science.gov (United States)

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  9. 1st Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Lahiri, S; Politis, Dimitris

    2014-01-01

    This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for NonParametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M. G. Akritas, S. N. Lahiri, and D. N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R. Beran, P. Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. Subba Rao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world. The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the wo...

  10. Using the Statistical Indicators for the General Insurances Activity

    Directory of Open Access Journals (Sweden)

    Ion Partachi

    2007-04-01

    Full Text Available The statistics of general insurance activity is widely used in actuarial calculations. Actuarial analyses are carried out exclusively on the basis of primary and derived indicators, which are drawn up by various statistical methods. The statistical indicators used in this respect are obtained on the basis of the factors and conditions under which compensation cases occur. Actuarial analysis is also performed over time, using chronological series that allow the decomposition of the phenomenon under study by its factors of influence. In this article, after briefly presenting a number of points of view regarding the use of statistical indicators in actuarial analysis, we analyze, successively, a series of issues, such as: the statistical indicators of general insurance fund formation, expressed in physical and value units, or as absolute, relative and average volumes; the statistical indicators of the utilization of general insurance funds (with the same diversified forms of expression); and the statistical indicators of the outcomes of the general insurance activity. Particular emphasis is placed on certain methodological aspects regarding the calculation of the above-mentioned indicators, highlighting particular characteristics of their use within actuarial analysis. The article stresses that these indicators are used in actuarial analysis as a real system. The respective proportions are enumerated, underlining the concrete possibilities of computation, which secure the possibility of performing the analyses required by a decisional process.

  11. The physics benchmark processes for the detector performance studies used in CLIC CDR Volume 3

    CERN Document Server

    Allanach, B.J.; Desch, K.; Ellis, J.; Giudice, G.; Grefe, C.; Kraml, S.; Lastovicka, T.; Linssen, L.; Marschall, J.; Martin, S.P.; Muennich, A.; Poss, S.; Roloff, P.; Simon, F.; Strube, J.; Thomson, M.; Wells, J.D.

    2012-01-01

    This note describes the detector benchmark processes used in volume 3 of the CLIC conceptual design report (CDR), which explores a staged construction and operation of the CLIC accelerator. The goal of the detector benchmark studies is to assess the performance of the CLIC ILD and CLIC SiD detector concepts for different physics processes and at a few CLIC centre-of-mass energies.

  12. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings analysed all cases of severe postpartum haemorrhage after vaginal delivery and provided feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries, with the quarterly rate crossing the control limits, that is, going out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Spatial statistics of pitting corrosion patterning: Quadrat counts and the non-homogeneous Poisson process

    International Nuclear Information System (INIS)

    Lopez de la Cruz, J.; Gutierrez, M.A.

    2008-01-01

    This paper presents a stochastic analysis of spatial point patterns as an effect of localized pitting corrosion. The Quadrat Counts method is studied with two empirical pit patterns. The results are dependent on the quadrat size, and bias is introduced when empty quadrats are accounted for in the analysis. The spatially inhomogeneous Poisson process is used to improve the performance of the Quadrat Counts method, combining Quadrat Counts with distance-based statistics in the analysis of pit patterns. The Inter-Event and the Nearest-Neighbour statistics are implemented here in order to compare their results. Further, the treatment of patterns in irregular domains is discussed.
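
    A minimal quadrat-counts sketch on hypothetical pit coordinates; as the abstract notes, the outcome of this kind of chi-square test depends on the chosen quadrat size:

```python
import numpy as np
from scipy import stats

# Hypothetical pit coordinates on a unit-square specimen.
rng = np.random.default_rng(3)
pits = rng.uniform(size=(200, 2))

# Quadrat counts: divide the domain into a k x k grid and compare the
# observed counts with the homogeneous Poisson expectation via chi-square.
k = 5
counts, _, _ = np.histogram2d(pits[:, 0], pits[:, 1], bins=k)
expected = len(pits) / k**2
chi2 = ((counts - expected) ** 2 / expected).sum()
dof = k**2 - 1
p = stats.chi2.sf(chi2, dof)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3f}")  # try other k values
```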

  14. Mathematical and Statistical Methods for Actuarial Sciences and Finance

    CERN Document Server

    Legros, Florence; Perna, Cira; Sibillo, Marilena

    2017-01-01

    This volume gathers selected peer-reviewed papers presented at the international conference "MAF 2016 – Mathematical and Statistical Methods for Actuarial Sciences and Finance", held in Paris (France) at the Université Paris-Dauphine from March 30 to April 1, 2016. The contributions highlight new ideas on mathematical and statistical methods in actuarial sciences and finance. The cooperation between mathematicians and statisticians working in insurance and finance is a very fruitful field, one that yields unique theoretical models and practical applications, as well as new insights in the discussion of problems of national and international interest. This volume is addressed to academicians, researchers, Ph.D. students and professionals.

  15. Theoretical physics 8 statistical physics

    CERN Document Server

    Nolting, Wolfgang

    2018-01-01

    This textbook offers a clear and comprehensive introduction to statistical physics, one of the core components of advanced undergraduate physics courses. It follows on naturally from the previous volumes in this series, using methods of probability theory and statistics to solve physical problems. The first part of the book gives a detailed overview on classical statistical physics and introduces all mathematical tools needed. The second part of the book covers topics related to quantized states, gives a thorough introduction to quantum statistics, followed by a concise treatment of quantum gases. Ideally suited to undergraduate students with some grounding in quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end of chapter problem sets. About the Theoretical Physics series Translated from the renowned and highly successf...

  16. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Science.gov (United States)

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportion of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in Dutch hospitals changed, a secondary analysis of the annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands was conducted using different statistical approaches. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grades 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods the prevalence rates of five hospitals varied by chance only. The results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
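
    The two statistical approaches being compared can be sketched side by side on hypothetical yearly survey counts; the trend test below is the Cochran-Armitage form of the chi-square trend test, and the P-chart limits are the usual 3-sigma ones:

```python
import numpy as np
from scipy import stats

# Hypothetical yearly counts: at-risk patients (n) and those with a
# nosocomial pressure ulcer (x), one row per survey year.
years = np.arange(1998, 2009)
n = np.array([150, 160, 155, 170, 165, 172, 168, 175, 180, 178, 182])
x = np.array([30, 28, 29, 27, 24, 25, 22, 21, 20, 18, 17])

# Cochran-Armitage trend test (chi-square trend).
pbar = x.sum() / n.sum()
w = years - years.mean()
T = np.sum(w * (x - n * pbar))
varT = pbar * (1 - pbar) * (np.sum(n * w**2) - np.sum(n * w) ** 2 / n.sum())
z = T / np.sqrt(varT)
print(f"trend test: z = {z:.2f}, p = {2 * stats.norm.sf(abs(z)):.4f}")

# P-chart view of the same data: are any yearly rates outside 3-sigma limits?
p = x / n
se = np.sqrt(pbar * (1 - pbar) / n)
outside = (p > pbar + 3 * se) | (p < np.maximum(pbar - 3 * se, 0))
print("years outside control limits:", years[outside])
```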

  17. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    Science.gov (United States)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  18. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  19. A bibliometric analysis of 50 years of worldwide research on statistical process control

    Directory of Open Access Journals (Sweden)

    Fabiane Letícia Lizarelli

    Full Text Available An increasing number of papers on statistical process control (SPC) have emerged in the last fifty years, especially in the last fifteen. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC plays a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. There are several other statistical techniques and tools for assisting the continuous improvement and innovation of products and processes but, despite the limitations in their use in improvement projects, there is growing concern about the use of SPC. A gap between the SPC techniques taught in engineering courses and their practical application to industrial problems is observed in empirical research; thus, it is important to understand what has been done and to identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers and their networks. The main sources of publications are also identified; the publication of SPC papers is concentrated in a few international research journals, not necessarily those with the highest impact factors. Furthermore, the papers are focused on the industrial engineering, operations research and management science fields. The most common term found in the papers was cumulative sum control charts, but new topics have emerged and been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods.
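
    Since cumulative sum control charts are the most common topic found, a minimal tabular CUSUM sketch on synthetic data may help fix ideas; the reference value k and decision limit h below follow common textbook defaults, not any particular surveyed paper:

```python
import numpy as np

# Hypothetical process observations with a small upward mean shift halfway.
rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(10.0, 1.0, 50), rng.normal(10.8, 1.0, 50)])

# Tabular CUSUM with reference value k = 0.5*sigma and decision limit
# h = 5*sigma, a common parameterization for detecting ~1-sigma shifts.
mu0, sigma = 10.0, 1.0
k, h = 0.5 * sigma, 5.0 * sigma
hi = lo = 0.0
for i, xi in enumerate(x):
    hi = max(0.0, hi + (xi - mu0) - k)  # accumulates evidence of upward shift
    lo = max(0.0, lo + (mu0 - xi) - k)  # accumulates evidence of downward shift
    if hi > h or lo > h:
        print(f"shift signalled at observation {i}")
        break
```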

  20. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and reliability of fabrication processes as the original sources of a good and reliable quality of the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of fabrication processes, assuming they are stable, to fulfil the product specifications. All these techniques can only result in as good a product as the product design is able to describe the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance-oriented. However, performance orientation is only successful through open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the West, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase an open attitude of the vendor toward the using utility. The responsibility of the utilities increased considerably in ensuring an adequate quality of the fuel they received. As a matter of fact, the utilities sometimes had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the West; however, it is now also reaching the utilities in the eastern countries. (author)

  1. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  2. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  3. Correlation of ultrasound estimated placental volume and umbilical cord blood volume in term pregnancy.

    Science.gov (United States)

    Pannopnut, Papinwit; Kitporntheranunt, Maethaphan; Paritakul, Panwara; Kongsomboon, Kittipong

    2015-01-01

    To investigate the correlation between ultrasound-measured placental volume and collected umbilical cord blood (UCB) volume in term pregnancy. An observational cross-sectional study of term singleton pregnant women in the labor ward at Maha Chakri Sirindhorn Medical Center was conducted. Placental thickness, height, and width were measured using two-dimensional (2D) ultrasound, and the placental volume was calculated using a volumetric mathematical model. After the delivery of the baby, UCB was collected and its volume measured immediately. Then birth weight, placental weight, and the actual placental volume were analyzed. Pearson's correlation was used to determine the correlation between each pair of variables. A total of 35 pregnant women were eligible for the study. The mean and standard deviation of the estimated and actual placental volumes were 534±180 mL and 575±118 mL, respectively. The median UCB volume was 140 mL (range 98-220 mL). The UCB volume did not have a statistically significant correlation with the estimated placental volume (correlation coefficient 0.15; p=0.37). However, the UCB volume was significantly correlated with the actual placental volume (correlation coefficient 0.62); a weaker significant correlation was also found (correlation coefficient 0.38; p=0.02). The estimated placental volume by 2D ultrasound was not significantly correlated with the UCB volume. Further studies to establish the correlation between the UCB volume and the estimated placental volume using other types of placental imaging may be needed.
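
    The two computational steps described, a volume estimate from 2D measurements and a Pearson correlation, can be sketched as follows. The ellipsoid-style formula V = pi/6 * height * width * thickness is an assumption for illustration only (the abstract does not state the paper's exact volumetric model), and all numbers are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical 2D ultrasound measurements (cm) for a few placentas.
height = np.array([18.2, 20.1, 17.5, 19.4])
width = np.array([16.0, 17.3, 15.8, 16.9])
thickness = np.array([3.1, 3.6, 2.9, 3.4])
est_volume = np.pi / 6 * height * width * thickness  # mL (1 cm^3 = 1 mL)

ucb_volume = np.array([132.0, 165.0, 120.0, 150.0])  # collected UCB (mL)
r, p = stats.pearsonr(est_volume, ucb_volume)
print(f"estimated volumes: {np.round(est_volume, 1)}")
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```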

  4. Region-specific reduction in brain volume in young adults with perinatal hypoxic-ischaemic encephalopathy.

    Science.gov (United States)

    Bregant, Tina; Rados, Milan; Vasung, Lana; Derganc, Metka; Evans, Alan C; Neubauer, David; Kostovic, Ivica

    2013-11-01

    A severe form of perinatal hypoxic-ischaemic encephalopathy (HIE) carries a high risk of perinatal death and severe neurological sequelae, while in mild HIE only discrete cognitive disorders may occur. The aim was to compare total brain volumes and region-specific cortical measurements between young adults with mild-to-moderate perinatal HIE and a healthy control group of the same age. MR imaging was performed in a cohort of 14 young adults (9 males, 5 females) with a history of mild or moderate perinatal HIE. The control group consisted of healthy participants matched with the HIE group by age and gender. Volumetric analysis was done after processing the MR images with the fully automated CIVET pipeline. We measured gyrification indices; total brain volume; the volumes of grey matter, white matter and cerebrospinal fluid; and the volume, thickness and area of the cerebral cortex in the parietal, occipital, frontal and temporal lobes, and of the isthmus cinguli, parahippocampal and cingulate gyri, and insula. The HIE patient group showed smaller absolute volumetric data. Statistically significant reductions were found in the right hemisphere: of cortical areas in the right temporal lobe and parahippocampal gyrus, of cortical volumes in the right temporal lobe, and of cortical thickness in the right isthmus of the cingulate gyrus. Comparison between the healthy group and the HIE group of the same gender showed statistically significant changes in the male HIE patients, where a significant reduction was found in whole brain volume; left parietal, bilateral temporal, and right parahippocampal gyrus cortical areas; and bilateral temporal lobe cortical volume. Our analysis of total brain volumes and region-specific corticometric parameters suggests that mild-to-moderate forms of perinatal HIE lead to reductions in whole brain volume. In this study, reductions were most pronounced in the temporal lobe and parahippocampal gyrus. Copyright © 2013 European Paediatric Neurology Society. All rights reserved.

  5. An Introduction to the Special Volume on Political Methodology

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

    Full Text Available This special volume of the Journal of Statistical Software on political methodology includes 14 papers, with wide-ranging software contributions of political scientists to their own field and, more generally, to statistical data analysis in the social sciences and beyond. Special emphasis is given to software that is written in, or can cooperate with, the R system for statistical computing.

  6. Alternative derivations of the statistical mechanical distribution laws.

    Science.gov (United States)

    Wall, F T

    1971-08-01

    A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
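
    For reference, a compact statement of the three distribution laws the abstract refers to, in the standard form obtained by equating the chemical potential across energy levels (this is the textbook result, not a derivation taken from the article itself):

```latex
% Mean occupation numbers of a level with energy \varepsilon_i at chemical
% potential \mu and temperature T; the sign in the denominator distinguishes
% the three statistics.
\begin{align*}
  \bar{n}_i^{\mathrm{B}}  &= e^{-(\varepsilon_i-\mu)/k_{\mathrm B}T}
      && \text{Boltzmann} \\
  \bar{n}_i^{\mathrm{FD}} &= \frac{1}{e^{(\varepsilon_i-\mu)/k_{\mathrm B}T}+1}
      && \text{Fermi--Dirac} \\
  \bar{n}_i^{\mathrm{BE}} &= \frac{1}{e^{(\varepsilon_i-\mu)/k_{\mathrm B}T}-1}
      && \text{Bose--Einstein}
\end{align*}
```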

  7. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    Science.gov (United States)

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was in control, an out-of-control situation occurred around the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG-148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
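
    A sketch of the two chart types and the capability indices named above, on hypothetical daily output deviations; the EWMA parameters (lambda = 0.2, L = 3) and the tolerance band are illustrative assumptions, not values taken from the study:

```python
import numpy as np

# Hypothetical daily output checks (% deviation from baseline).
rng = np.random.default_rng(5)
dev = rng.normal(0.0, 0.6, size=250)

# EWMA chart: z_t = lam*x_t + (1-lam)*z_{t-1}, with time-varying limits.
lam, L = 0.2, 3.0
mu, sigma = dev.mean(), dev.std(ddof=1)
z = mu
for t, x in enumerate(dev, start=1):
    z = lam * x + (1 - lam) * z
    limit = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    if abs(z - mu) > limit:
        print(f"EWMA signal on day {t}: z = {z:.2f}")

# Process capability against an assumed +/- 3% output tolerance band.
lsl, usl = -3.0, 3.0
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```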

  8. Sectional analysis for volume determination and selection of volume equations for the Tapajos Nacional Forest

    Directory of Open Access Journals (Sweden)

    Renato Bezerra da Silva Ribeiro

    2014-12-01

    Full Text Available The aim of this study was to analyze different section lengths for volume determination and to fit volumetric models for estimating timber production in a forest management area of the Tapajós National Forest (FNT). Six sectioning treatments were tested on 152 logs of 12 commercial species. The obtained volumes were statistically compared by analysis of variance (ANOVA) to choose the best method of sectioning and to calculate the actual volume of 2,094 sample trees in different commercial diameter classes. Ten mathematical models were fitted to the whole data set and to the species Manilkara huberi (Ducke) Chevalier (maçaranduba), Lecythis lurida (Miers) S.A. Mori (jarana) and Hymenaea courbaril L. (jatobá). The criteria used to choose the best model were the adjusted coefficient of determination in percent (R²adj%), the standard error of estimate in percent (Syx%), the significance of the parameters, the normality of the residuals, the variance inflation factor (VIF) and the graphic distribution of the residuals. There was no statistical difference between the methods of sectioning, and thus using the total length of the logs was more operational in the field. The models in logarithmic form of Schumacher and Hall and of Spurr were the best for estimating the volume for these species and for the whole sample set.
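
    The Schumacher and Hall model in logarithmic form, with the fit criteria mentioned (R²adj% and Syx%), can be sketched on hypothetical tree data as follows; the coefficients and data below are illustrative only:

```python
import numpy as np

# Hypothetical tree data: DBH (cm), height (m), and measured volume (m^3).
rng = np.random.default_rng(6)
dbh = rng.uniform(50, 120, 80)
height = rng.uniform(15, 35, 80)
volume = np.exp(-9.3 + 1.9 * np.log(dbh) + 0.9 * np.log(height)
                + rng.normal(0, 0.08, 80))

# Schumacher-Hall in logarithmic form: ln V = b0 + b1 ln(DBH) + b2 ln(H),
# fitted by ordinary least squares.
X = np.column_stack([np.ones_like(dbh), np.log(dbh), np.log(height)])
y = np.log(volume)
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
r2 = 1 - resid.var() / y.var()
syx_pct = 100 * np.sqrt(resid @ resid / (len(y) - 3)) / y.mean()  # on log scale

print(f"b0={b[0]:.3f}, b1={b[1]:.3f}, b2={b[2]:.3f}, "
      f"R^2={r2:.3f}, Syx%={syx_pct:.1f}")
```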

  9. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong; Jun, Mikyoung; Genton, Marc G.

    2017-01-01

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture

  10. Stochastic geometry, spatial statistics and random fields models and algorithms

    CERN Document Server

    2015-01-01

    Providing a graduate level introduction to various aspects of stochastic geometry, spatial statistics and random fields, this volume places a special emphasis on fundamental classes of models and algorithms as well as on their applications, for example in materials science, biology and genetics. This book has a strong focus on simulations and includes extensive codes in Matlab and R, which are widely used in the mathematical community. It can be regarded as a continuation of the recent volume 2068 of Lecture Notes in Mathematics, where other issues of stochastic geometry, spatial statistics and random fields were considered, with a focus on asymptotic methods.

  11. Stochastic processes, optimization, and control theory a volume in honor of Suresh Sethi

    CERN Document Server

    Yan, Houmin

    2006-01-01

    This edited volume contains 16 research articles. It presents recent and pressing issues in stochastic processes, control theory, differential games, optimization, and their applications in finance, manufacturing, queueing networks, and climate control. One of the salient features is that the book is highly multi-disciplinary. The book is dedicated to Professor Suresh Sethi on the occasion of his 60th birthday, in view of his distinguished career.

  12. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics.

    Science.gov (United States)

    Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid

  13. Minerals Yearbook, volume III, Area Reports—International—Europe and Central Eurasia

    Science.gov (United States)

    Geological Survey, U.S.

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  14. Minerals Yearbook, volume III, Area Reports—International—Asia and the Pacific

    Science.gov (United States)

    Geological Survey, U.S.

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  15. Minerals Yearbook, volume III, Area Reports—International—Latin America and Canada

    Science.gov (United States)

    ,

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  16. Simple classical model for Fano statistics in radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, David V. [Pacific Northwest National Laboratory, National Security Division - Radiological and Chemical Sciences Group PO Box 999, Richland, WA 99352 (United States)], E-mail: David.Jordan@pnl.gov; Renholds, Andrea S.; Jaffe, John E.; Anderson, Kevin K.; Rene Corrales, L.; Peurrung, Anthony J. [Pacific Northwest National Laboratory, National Security Division - Radiological and Chemical Sciences Group PO Box 999, Richland, WA 99352 (United States)

    2008-02-01

    A simple classical model that captures the essential statistics of energy partitioning processes involved in the creation of information carriers (ICs) in radiation detectors is presented. The model pictures IC formation from a fixed amount of deposited energy in terms of the statistically analogous process of successively sampling water from a large, finite-volume container ('bathtub') with a small dipping implement ('shot or whiskey glass'). The model exhibits sub-Poisson variance in the distribution of the number of ICs generated (the 'Fano effect'). Elementary statistical analysis of the model clarifies the role of energy conservation in producing the Fano effect and yields Fano's prescription for computing the relative variance of the IC number distribution in terms of the mean and variance of the underlying, single-IC energy distribution. The partitioning model is applied to the development of the impact ionization cascade in semiconductor radiation detectors. It is shown that, in tandem with simple assumptions regarding the distribution of energies required to create an (electron, hole) pair, the model yields an energy-independent Fano factor of 0.083, in accord with the lower end of the range of literature values reported for silicon and high-purity germanium. The utility of this simple picture as a diagnostic tool for guiding or constraining more detailed, 'microscopic' physical models of detector material response to ionizing radiation is discussed.
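
    The 'bathtub and shot glass' partitioning picture lends itself to a short Monte Carlo sketch. The per-carrier energy-cost distribution below is an assumption chosen to give a coefficient of variation of 0.3, so Fano's relation predicts a factor of about (0.3)^2 = 0.09:

```python
import numpy as np

# Monte Carlo version of the partitioning model: a fixed deposited energy E
# is divided into information carriers by drawing per-carrier energy costs
# until the energy is used up; energy conservation is what pushes the
# variance of the carrier count below the Poisson value.
rng = np.random.default_rng(7)
E = 5_000.0                 # deposited energy (arbitrary units)
w_mean, w_sd = 10.0, 3.0    # assumed per-carrier energy-cost distribution
trials, max_n = 5_000, 800  # max_n comfortably exceeds E / w_mean

costs = np.clip(rng.normal(w_mean, w_sd, size=(trials, max_n)), 0.1, None)
cum = np.cumsum(costs, axis=1)
n_carriers = (cum <= E).sum(axis=1)  # carriers created before energy runs out

fano = n_carriers.var() / n_carriers.mean()
print(f"mean carriers = {n_carriers.mean():.1f}, Fano factor = {fano:.3f}")
# Renewal-theory prediction: F ~ (w_sd / w_mean)^2 = 0.09, well below 1.
```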

  17. Workshop on Analytical Methods in Statistics

    CERN Document Server

    Jurečková, Jana; Maciak, Matúš; Pešta, Michal

    2017-01-01

    This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, eventually under shape or other constraints or long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.

  18. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Directory of Open Access Journals (Sweden)

    Fabiano Rodrigues Soriano

    Full Text Available Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring and analysis of the causes of variation in quality characteristics and/or in the parameters used to control and improve processes. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as lack of top management commitment and little understanding of its potential benefits, and to technical factors, such as lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top management commitment (support), SPC training and application, as well as the relationships between these factors and the benefits associated with implementing the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish causal relations. A cross-sectional survey was used to collect information from a sample of Brazilian auto-parts companies, selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey; 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs; in turn, the training affects the way companies apply the techniques, and this is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected to each other and are represented by the relation between top management support and training. The technical aspects observed through SPC

  19. Computing Science and Statistics. Volume 24. Graphics and Visualization

    Science.gov (United States)

    1993-03-01

    Proceedings contributions on graphics and visualization, including density estimation techniques and applications of Gibbs sampling; contributors include Mike West (Institute of Statistics & Decision Sciences, Duke University, Durham, NC), Allen McIntosh and Michael T. Longnecker.

  20. Acidic precipitation. Volume 4: Soils, aquatic processes, and lake acidification. Advances in environmental science

    Energy Technology Data Exchange (ETDEWEB)

    Norton, S.A.; Lindberg, S.E.; Page, A.L. (eds.)

    1990-01-01

    Acidic precipitation and its effects have been the focus of intense research for over two decades. Recently, research has focused on a greater understanding of dose-response relationships between atmospheric loading of acidifying material and lake acidity. This volume of the subseries Acidic Precipitation emphasizes acid neutralizing processes and the capacity of terrestrial and aquatic systems to assimilate acidifying substances and, conversely, the ability of systems to recover after acid loading diminishes. Eight chapters have been processed separately for inclusion in the appropriate data bases.

  1. Masked areas in shear peak statistics: a forward modeling approach

    International Nuclear Information System (INIS)

    Bard, D.; Kratochvil, J. M.; Dawson, W.

    2016-01-01

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will provide important constraints on cosmological models in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST- and DES-scale surveys. By reconstructing maps of aperture mass, the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure between the data and simulated volumes, due to cosmic variance.

  2. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  3. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  4. Statistical learning problem of artificial neural network to control roofing process

    Directory of Open Access Journals (Sweden)

    Lapidus Azariy

    2017-01-01

    Software developed on the basis of artificial neural networks (ANN) is now being actively implemented in construction companies to support decision-making in the organization and management of construction processes. ANN learning is the main stage of its development. A key question for supervised learning is how many training examples are needed to approximate the true relationship between network inputs and output with the desired accuracy. The design of the ANN architecture is also related to the learning problem known as the "curse of dimensionality". This problem is important for the study of construction process management because of the difficulty of obtaining training data from construction sites. In previous studies the authors designed a 4-layer feedforward ANN with a unit model of 12-5-4-1 to approximate estimation and prediction of the roofing process. This paper presents the statistical learning side of the created ANN with a simple error-minimization algorithm. The sample size needed for efficient training and the confidence interval of the network outputs are defined. In conclusion, the authors predict successful ANN learning in a large construction business company within a short space of time.
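
    As a rough companion to the architecture described above, here is a NumPy sketch of a 12-5-4-1 feedforward network trained by plain backpropagation; only the layer sizes are taken from the abstract, while the sigmoid activations, learning rate and synthetic stand-in data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [12, 5, 4, 1]          # the 12-5-4-1 unit model named in the abstract

# One weight matrix and bias vector per layer transition
Ws = [rng.normal(0.0, 0.5, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(n) for n in sizes[1:]]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(X):
    """Return the activations of every layer for a batch X of shape (N, 12)."""
    acts = [X]
    for W, b in zip(Ws, bs):
        acts.append(sigmoid(acts[-1] @ W + b))
    return acts

# Synthetic records standing in for roofing-process training data
X = rng.uniform(size=(200, 12))
y = (X.mean(axis=1, keepdims=True) > 0.5).astype(float)

lr = 0.5
for epoch in range(3000):
    acts = forward(X)
    delta = (acts[-1] - y) * acts[-1] * (1.0 - acts[-1])  # squared-error gradient
    for i in reversed(range(len(Ws))):
        grad_W = acts[i].T @ delta / len(X)
        grad_b = delta.mean(axis=0)
        if i:  # propagate the error one layer back before updating
            delta = (delta @ Ws[i].T) * acts[i] * (1.0 - acts[i])
        Ws[i] -= lr * grad_W
        bs[i] -= lr * grad_b

print("final training MSE:", float(((forward(X)[-1] - y) ** 2).mean()))
```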

  5. Learning Curves and Bootstrap Estimates for Inference with Gaussian Processes: A Statistical Mechanics Study

    DEFF Research Database (Denmark)

    Malzahn, Dorthe; Opper, Manfred

    2003-01-01

    We employ the replica method of statistical physics to study the average case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on Gaussian processes, we discuss Bootstrap estimates for learning curves.
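
    The bootstrap side of the idea can be sketched with scikit-learn's Gaussian process regressor. Everything below (toy data, kernel, sizes) is an illustrative assumption, not the authors' replica calculation: resampling the training pool at several sizes gives an empirical learning curve with bootstrap error bars.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Toy data: a noisy sine standing in for "real data"
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(0.0, 0.1, size=300)
X_pool, y_pool = X[:200], y[:200]      # pool to draw bootstrap resamples from
X_test, y_test = X[200:], y[200:]

def boot_error(n_train, n_boot=30):
    """Bootstrap estimate of the mean test error at one training-set size."""
    errs = []
    for _ in range(n_boot):
        idx = rng.choice(len(X_pool), size=n_train, replace=True)
        gp = GaussianProcessRegressor(kernel=RBF(), alpha=0.01)
        gp.fit(X_pool[idx], y_pool[idx])
        errs.append(np.mean((gp.predict(X_test) - y_test) ** 2))
    return np.mean(errs), np.std(errs)

for n in (10, 20, 40, 80):             # one point of the learning curve each
    m, s = boot_error(n)
    print(f"n={n:3d}: test MSE = {m:.4f} +/- {s:.4f}")
```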

  6. Improving Usage Statistics Processing for a Library Consortium: The Virtual Library of Virginia's Experience

    Science.gov (United States)

    Matthews, Tansy E.

    2009-01-01

    This article describes the development of the Virtual Library of Virginia (VIVA). The VIVA statistics-processing system remains a work in progress. Member libraries will benefit from the ability to obtain the actual data from the VIVA site, rather than just the summaries, so a project to make these data available is currently being planned. The…

  7. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as diagnostic access, manual control of subsystems and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives with a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  8. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as diagnostic access, manual control of subsystems and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives with a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
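
    To make the Modbus backbone concrete, here is a minimal polling sketch using the third-party pymodbus library (3.x API assumed); the serial port, slave address and register map are hypothetical stand-ins, not the LVPD's actual configuration.

```python
# Polling one subsystem register block over RS485/Modbus (pymodbus 3.x assumed).
from pymodbus.client import ModbusSerialClient

client = ModbusSerialClient(port="/dev/ttyUSB0", baudrate=9600, parity="N")
if client.connect():
    # Hypothetical register map: holding registers 0-1 hold a filament power
    # supply (FPS) current setpoint and its readback.
    rr = client.read_holding_registers(address=0, count=2, slave=1)
    if not rr.isError():
        setpoint, readback = rr.registers
        print(f"FPS current: setpoint={setpoint}, readback={readback}")
    client.close()
```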

  9. The Role of Statistics in Business and Industry

    CERN Document Server

    Hahn, Gerald J

    2011-01-01

    An insightful guide to the use of statistics for solving key problems in modern-day business and industry This book has been awarded the Technometrics Ziegel Prize for the best book reviewed by the journal in 2010. Technometrics is a journal of statistics for the physical, chemical and engineering sciences, published jointly by the American Society for Quality and the American Statistical Association. Criteria for the award include that the book brings together in one volume a body of material previously only available in scattered research articles and having the potential to significantly im

  10. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  11. On the structure of dynamic principal component analysis used in statistical process monitoring

    DEFF Research Database (Denmark)

    Vanhatalo, Erik; Kulahci, Murat; Bergquist, Bjarne

    2017-01-01

    When principal component analysis (PCA) is used for statistical process monitoring it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time... driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method... for determining the number of principal components to retain. The number of retained principal components is determined by visual inspection of the serial correlation in the squared prediction error statistic, Q (SPE), together with the cumulative explained variance of the model. The methods are illustrated using...
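
    The lag-selection idea can be loosely illustrated as follows (a sketch of the kind of quantity inspected, not the authors' algorithm): compute the lag-k autocorrelation matrices of the standardized data and watch how their leading eigenvalues decay with k.

```python
import numpy as np

def lagged_autocorr_eigvals(X, max_lag):
    """Eigenvalue magnitudes of the lag-k autocorrelation matrices of X.

    X has shape (n_samples, n_vars); columns are standardized first.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    n = len(Z)
    out = {}
    for k in range(1, max_lag + 1):
        R_k = Z[:-k].T @ Z[k:] / (n - k)   # lag-k autocorrelation matrix
        out[k] = np.sort(np.abs(np.linalg.eigvals(R_k)))[::-1]
    return out

# Example: a VAR(1)-like process, whose lagged correlation decays geometrically
rng = np.random.default_rng(3)
X = np.zeros((500, 4))
for t in range(1, 500):
    X[t] = 0.7 * X[t - 1] + rng.normal(size=4)

for k, ev in lagged_autocorr_eigvals(X, 4).items():
    print(f"lag {k}: largest eigenvalue magnitude {ev[0]:.2f}")
```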

  12. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against the quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart, and thus neglects interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, where 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA), with the data either autoscaled or robustly scaled, showed four and seven batches, respectively, outside the Hotelling T² 95% ellipse. An improvement of the process capability is also observed when the most extreme batches are excluded. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
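
    A bare-bones version of the MSPC side of such an assessment might look like the following sketch. The data are synthetic (only the 164 batches x 6 ingredients shape mirrors the study): PCA on autoscaled assays, with a 95% Hotelling T² limit flagging batches outside the ellipse.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic stand-in for 164 batches x 6 active-ingredient assays (% of label)
X = rng.normal(100.0, 1.0, size=(164, 6))

Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # autoscaling, as in the study
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
a = 2                                              # retained components
T = Xs @ Vt[:a].T                                  # PCA scores
lam = s[:a] ** 2 / (len(X) - 1)                    # score variances

# Hotelling T^2 per batch and its 95% limit (F-distribution form)
n = len(X)
t2 = np.sum(T ** 2 / lam, axis=1)
lim = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.95, a, n - a)
print("batches outside the 95% ellipse:", np.where(t2 > lim)[0])
```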

  13. Fiducial registration error as a statistical process control metric in image-guided radiotherapy with prostatic markers

    International Nuclear Information System (INIS)

    Ung, M.N.; Wee, Leonard

    2010-01-01

    Full text: Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body (PBRB) image registration during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). Statistical process control charts for individual patients can be designed to identify potentially significant deviation of FRE from expected behaviour. In this study, the aim was to retrospectively apply statistical process control methods to FREs in 34 individuals to identify parameters that may impact on the process stability in image-based localization. A robust procedure for estimating control parameters, control limits and fixed tolerance levels from a small number of initial observations has been proposed and discussed. Four distinct types of qualitative control chart behaviour have been observed. Probable clinical factors leading to IGRT process instability are discussed in light of the control chart behaviour. Control charts have been shown to be a useful decision-making tool for detecting potentially out-of-control processes on an individual basis. They can sensitively identify potential problems that warrant more detailed investigation in the IGRT of prostate cancer.
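
    One common way to build such per-patient charts is an individuals (I) chart with moving-range-based limits. The sketch below is a generic version of that idea, not the authors' exact design, and the FRE values are hypothetical.

```python
import numpy as np

def individuals_chart(x):
    """Center line and 3-sigma limits of an individuals (I) chart,
    with sigma estimated as mean moving range / 1.128 (d2 for n = 2)."""
    x = np.asarray(x, dtype=float)
    sigma = np.abs(np.diff(x)).mean() / 1.128
    cl = x.mean()
    return cl, cl - 3 * sigma, cl + 3 * sigma

# Hypothetical fiducial registration errors (mm) for one patient's fractions
fre = [0.8, 0.9, 0.7, 1.0, 0.8, 0.9, 1.6, 0.8, 0.9, 0.7]
cl, lcl, ucl = individuals_chart(fre)
flags = [i for i, v in enumerate(fre) if not lcl <= v <= ucl]
print(f"CL={cl:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, out-of-control points: {flags}")
```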

  14. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. A 6-MV beam with nine fields was used for the IMRT plans and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90.0%. Both the IMRT and VMAT QA processes are of good quality because their Cpml values are higher than 1.0.
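
    The control-limit and capability calculations can be sketched generically as below. The data are hypothetical, and the Cpml form shown is one common one-sided, target-based definition that may differ in detail from the paper's.

```python
import numpy as np

def control_limits(x):
    """3-sigma individuals-chart limits estimated from an initial QA series."""
    x = np.asarray(x, dtype=float)
    sigma = np.abs(np.diff(x)).mean() / 1.128   # moving-range estimate (d2)
    return x.mean() - 3 * sigma, x.mean() + 3 * sigma

def cpml(x, lsl, target=100.0):
    """One-sided lower capability index (one common target-based definition)."""
    x = np.asarray(x, dtype=float)
    tau = np.sqrt(x.var(ddof=1) + (x.mean() - target) ** 2)
    return (x.mean() - lsl) / (3 * tau)

rng = np.random.default_rng(5)
gamma_pass = np.clip(rng.normal(96.7, 2.2, size=50), 0, 100)  # first 50 plans
lcl, ucl = control_limits(gamma_pass)
print(f"LCL={lcl:.1f}%, UCL={min(ucl, 100):.1f}%")
print(f"Cpml (LSL=90%) = {cpml(gamma_pass, lsl=90.0):.2f}")
```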

  15. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means which guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs

  16. Computational analysis of particle reinforced viscoelastic polymer nanocomposites - statistical study of representative volume element

    Science.gov (United States)

    Hu, Anqi; Li, Xiaolin; Ajdari, Amin; Jiang, Bing; Burkhart, Craig; Chen, Wei; Brinson, L. Catherine

    2018-05-01

    The concept of representative volume element (RVE) is widely used to determine the effective material properties of random heterogeneous materials. In the present work, the RVE is investigated for the viscoelastic response of particle-reinforced polymer nanocomposites in the frequency domain. The smallest RVE size and the minimum number of realizations at a given volume size for both structural and mechanical properties are determined for a given precision using the concept of margin of error. It is concluded that using the mean of many realizations of a small RVE instead of a single large RVE can retain the desired precision of a result with much lower computational cost (up to three orders of magnitude reduced computation time) for the property of interest. Both the smallest RVE size and the minimum number of realizations for a microstructure with higher volume fraction (VF) are larger compared to those of one with lower VF at the same desired precision. Similarly, a clustered structure is shown to require a larger minimum RVE size as well as a larger number of realizations at a given volume size compared to the well-dispersed microstructures.
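
    In its simplest normal-approximation form, the margin-of-error criterion mentioned above reduces to the familiar sample-size formula. The sketch below (illustrative values, not the paper's data) estimates how many RVE realizations are needed for a target relative precision of the mean property.

```python
import numpy as np
from scipy import stats

def n_realizations(samples, rel_error=0.01, conf=0.95):
    """Smallest number of realizations so that the confidence-interval
    half-width of the mean stays within rel_error of the sample mean."""
    s, m = np.std(samples, ddof=1), np.mean(samples)
    z = stats.norm.ppf(0.5 + conf / 2)          # two-sided critical value
    return int(np.ceil((z * s / (rel_error * m)) ** 2))

# Hypothetical moduli from a pilot set of small-RVE simulations (arb. units)
rng = np.random.default_rng(6)
moduli = rng.normal(3.2, 0.25, size=20)
print("realizations needed for 1% precision:", n_realizations(moduli))
```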

  17. Numerical consideration for multiscale statistical process control method applied to nuclear material accountancy

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Hori, Masato; Asou, Ryoji; Usuda, Shigekazu

    2006-01-01

    The multiscale statistical process control (MSSPC) method is applied to clarify the elements of material unaccounted for (MUF) in large scale reprocessing plants using numerical calculations. Continuous wavelet functions are used to decompose the process data, which simulate batch operation superimposed by various types of disturbance, and the disturbance components included in the data are divided into time and frequency spaces. The diagnosis of MSSPC is applied to distinguish abnormal events from the process data and shows how to detect abrupt and protracted diversions using principal component analysis. Quantitative performance of MSSPC for the time series data is shown with average run lengths given by Monte-Carlo simulation to compare to the non-detection probability β. Recent discussion about bias corrections in material balances is introduced and another approach is presented to evaluate MUF without assuming the measurement error model. (author)

  18. Exploring the relation between process design and efficiency in high-volume cataract pathways from a lean thinking perspective

    NARCIS (Netherlands)

    van Vliet, Ellen J.; Bredenhoff, E.; Bredenhoff, Eelco; Sermeus, Walter; Kop, Lucas M.; Sol, Johannes C.A.; van Harten, Willem H.

    2011-01-01

    Objective: To compare process designs of three high-volume cataract pathways in a lean thinking framework and to explore how efficiency in terms of lead times, hospital visits and costs is related to process design. Design: International retrospective comparative benchmark study with a mixed-method

  19. Proceedings of waste stream minimization and utilization innovative concepts: An experimental technology exchange. Volume 1, Industrial solid waste processing municipal waste reduction/recycling

    Energy Technology Data Exchange (ETDEWEB)

    Lee, V.E. [ed.; Watts, R.L.

    1993-04-01

    This two-volume proceedings summarizes the results of fifteen innovations that were funded through the US Department of Energy's Innovative Concept Program. The fifteen innovations were presented at the sixth Innovative Concepts Fair, held in Austin, Texas, on April 22-23, 1993. The concepts in this year's fair address innovations that can substantially reduce or use waste streams. Each paper describes the need for the proposed concept, the concept being proposed, and the concept's economics and market potential, key experimental results, and future development needs. The papers are divided into two volumes: Volume 1 addresses innovations for industrial solid waste processing and municipal waste reduction/recycling, and Volume 2 addresses industrial liquid waste processing and industrial gaseous waste processing. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  20. Asymptotic comparisons of U-statistics, V-statistics and limits of Bayes estimates by deficiencies

    OpenAIRE

    Nomachi, Toshifumi; Yamato, Hajime; Graduate School of Science and Engineering, Kagoshima University; Miyakonojo College of Technology; Faculty of Science, Kagoshima University

    2001-01-01

    As estimators of estimable parameters, we consider three statistics: the U-statistic, the V-statistic and the limit of a Bayes estimate. This limit of a Bayes estimate, called the LB-statistic in this paper, is obtained from the Bayes estimate of an estimable parameter based on the Dirichlet process, by letting its parameter tend to zero. For an estimable parameter with non-degenerate kernel, the asymptotic relative efficiencies of the LB-statistic with respect to the U-statistic and V-statistic and that of the V-statistic w...
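
    For reference, the standard definitions of the first two estimators, shown here for a symmetric kernel h of degree two (the LB-statistic is specific to the paper and is not reproduced):

```latex
% U- and V-statistics for a symmetric kernel h of degree 2
U_n = \binom{n}{2}^{-1} \sum_{1 \le i < j \le n} h(X_i, X_j),
\qquad
V_n = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} h(X_i, X_j)
```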

  1. Weibull statistics effective area and volume in the ball-on-ring testing method

    DEFF Research Database (Denmark)

    Frandsen, Henrik Lund

    2014-01-01

    The ball-on-ring method is, together with other biaxial bending methods, often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength... to geometries relevant for the application of the material, the effective area or volume of the test specimen must be evaluated. In this work analytical expressions for the effective area and volume of the ball-on-ring test specimen are derived. In the derivation the multiaxial stress field has been accounted...
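
    The role of the effective volume in such scaling can be summarized by the standard weakest-link relations (general Weibull-statistics expressions, not the specific ball-on-ring formulas derived in the paper):

```latex
% Weakest-link effective volume and the resulting strength-size scaling
V_{\mathrm{eff}} = \int_{V} \left( \frac{\sigma(\mathbf{x})}{\sigma_{\max}} \right)^{m} \mathrm{d}V,
\qquad
\frac{\sigma_{1}}{\sigma_{2}} = \left( \frac{V_{\mathrm{eff},2}}{V_{\mathrm{eff},1}} \right)^{1/m}
```

    Here m is the Weibull modulus and σmax the peak stress in the specimen; equal failure probability between two geometries requires their strengths to scale with the inverse 1/m power of the effective volumes.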

  2. Difficult decisions: A qualitative exploration of the statistical decision making process from the perspectives of psychology students and academics

    Directory of Open Access Journals (Sweden)

    Peter James Allen

    2016-02-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these ‘experts’ were able to describe a far more systematic, comprehensive, flexible and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in

  3. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics

    Science.gov (United States)

    Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid

  4. Minerals Yearbook, volume III, Area Reports—International—Africa and the Middle East

    Science.gov (United States)

    ,

    2018-01-01

    The U.S. Geological Survey (USGS) Minerals Yearbook discusses the performance of the worldwide minerals and materials industries and provides background information to assist in interpreting that performance. Content of the individual Minerals Yearbook volumes follows:Volume I, Metals and Minerals, contains chapters about virtually all metallic and industrial mineral commodities important to the U.S. economy. Chapters on survey methods, summary statistics for domestic nonfuel minerals, and trends in mining and quarrying in the metals and industrial mineral industries in the United States are also included.Volume II, Area Reports: Domestic, contains a chapter on the mineral industry of each of the 50 States and Puerto Rico and the Administered Islands. This volume also has chapters on survey methods and summary statistics of domestic nonfuel minerals.Volume III, Area Reports: International, is published as four separate reports. These regional reports contain the latest available minerals data on more than 180 foreign countries and discuss the importance of minerals to the economies of these nations and the United States. Each report begins with an overview of the region’s mineral industries during the year. It continues with individual country chapters that examine the mining, refining, processing, and use of minerals in each country of the region and how each country’s mineral industry relates to U.S. industry. Most chapters include production tables and industry structure tables, information about Government policies and programs that affect the country’s mineral industry, and an outlook section.The USGS continually strives to improve the value of its publications to users. Constructive comments and suggestions by readers of the Minerals Yearbook are welcomed.

  5. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    Science.gov (United States)

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to the differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
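
    The overall recipe can be sketched in a few lines (a simplified illustration on synthetic data, not the authors' batch-process implementation): fit a PCA model on normal-operating-conditions data, then bootstrap the training runs to obtain percentile limits for each variable's SPE contribution.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic normal-operating-conditions (NOC) training data: runs x variables
X = rng.normal(size=(100, 8))
mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)

_, _, Vt = np.linalg.svd((X - mu) / sd, full_matrices=False)
P = Vt[:3].T                                  # loadings of a 3-component model

def spe_contributions(data):
    """Per-variable contributions to the squared prediction error (SPE)."""
    Z = (data - mu) / sd
    resid = Z - Z @ P @ P.T                   # residual after the PCA model
    return (resid ** 2).mean(axis=0)

# Bootstrap the NOC runs to get a 95% limit for each variable's contribution
boot = np.empty((500, X.shape[1]))
for b in range(500):
    boot[b] = spe_contributions(X[rng.choice(len(X), len(X), replace=True)])
limits = np.percentile(boot, 95, axis=0)
print("bootstrap 95% contribution limits:", np.round(limits, 3))
```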

  6. 8th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Giordani, Paolo; Vantaggi, Barbara; Gagolewski, Marek; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd

    2017-01-01

    This proceedings volume is a collection of peer reviewed papers presented at the 8th International Conference on Soft Methods in Probability and Statistics (SMPS 2016) held in Rome (Italy). The book is dedicated to Data science which aims at developing automated methods to analyze massive amounts of data and to extract knowledge from them. It shows how Data science employs various programming techniques and methods of data wrangling, data visualization, machine learning, probability and statistics. The soft methods proposed in this volume represent a collection of tools in these fields that can also be useful for data science.

  7. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    International Nuclear Information System (INIS)

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its (1) review of the human system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review which identify criteria for assessing the implementation of an applicant's or licensee's HSI design

  8. Multi-element neutron activation analysis and solution of classification problems using multidimensional statistics

    International Nuclear Information System (INIS)

    Vaganov, P.A.; Kol'tsov, A.A.; Kulikov, V.D.; Mejer, V.A.

    1983-01-01

    The multi-element instrumental neutron activation analysis of samples of mountain rocks (sandstones, aleurolites and shales of one of the gold deposits) is performed. The spectra of the irradiated samples are measured by a Ge(Li) detector with a volume of 35 mm³. The content of 22 chemical elements is determined in each sample. The results of the analysis serve as a reliable basis for multi-dimensional statistical information processing; they constitute the basis for the generalized characteristics of rocks which brings about the solution of the classification problem for rocks of different deposits

  9. Statistics of extreme events in Chinese stock markets

    International Nuclear Information System (INIS)

    Wu Gan-Hua; Qiu Lu; Li Xin-Li; Yang Yue; Yang Hui-Jie; Jiang Yan; Stephen Mutua

    2014-01-01

    We investigate the impact of financial factors on recurrence intervals of daily volume in the developing Chinese stock markets. The tails of the probability distribution functions (PDFs) of the volume recurrence intervals follow a power law, and the scaling exponent decreases with increasing stock lifetime, similar to what is found in the US stock markets, which are typical representatives of developed markets. The difference is that the power-law exponent values remain almost the same under changes of market capitalization, mean volume, and mean trading value, respectively. These findings enrich the results of event statistics for financial markets. (interdisciplinary physics and related areas of science and technology)

  10. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    Science.gov (United States)

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  11. 2015 ICSA/Graybill Applied Statistics Symposium

    CERN Document Server

    Wang, Bushi; Hu, Xiaowen; Chen, Kun; Liu, Ray

    2016-01-01

    The papers in this volume represent a broad, applied swath of advanced contributions to the 2015 ICSA/Graybill Applied Statistics Symposium of the International Chinese Statistical Association, held at Colorado State University in Fort Collins. The contributions cover topics that range from statistical applications in business and finance to applications in clinical trials and biomarker analysis. Each paper was peer-reviewed by at least two referees and also by an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including from North America, Asia, and Europe. Focuses on statistical applications from clinical trials, biomarker analysis, and personalized medicine to applications in finance and business analytics A unique selection of papers from broad and multi-disciplinary critical hot topics - from academic, government, and industry perspectives - to appeal to a wide variety of applied research interests All papers feature origina...

  12. Halo statistics analysis within medium volume cosmological N-body simulation

    Directory of Open Access Journals (Sweden)

    Martinović N.

    2015-01-01

    In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. A visualization of the simulation is portrayed as well. At redshifts z = 0-7 halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 - 10^14 M⊙/h. [Project 176021 'Visible and invisible matter in nearby galaxies: theory and observations']

  13. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Directory of Open Access Journals (Sweden)

    Alireza Raygan Shirazinezhad

    2015-06-01

    Background: Wastewater treatment includes very complex and interrelated physical, chemical and biological processes which, using data analysis techniques, can be rigorously represented by non-complex mathematical models. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer Ahmad were used. A total of 3306 data points for COD, TSS, pH and turbidity were collected, then analyzed with SPSS-16 (descriptive statistics) and IBM SPSS Modeler 14.2 (data analysis), using nine algorithms. Results: The logistic regression, neural network, Bayesian network, discriminant analysis, C5 decision tree, C&R tree, CHAID, QUEST and SVM algorithms had accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92, respectively. Discussion and conclusion: The C5 algorithm, with an accuracy of 97.89, was chosen as the best and most applicable algorithm for modeling the wastewater treatment processes; the most influential variables in this model were pH, COD, TSS and turbidity.

  14. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    Science.gov (United States)

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  15. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    International Nuclear Information System (INIS)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-01-01

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety

  16. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    Science.gov (United States)

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

    Fluid bed granulation is a batch process, which is characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts in which the average batch trajectory and upper and lower control limits are displayed were developed. Next, these control charts were used to monitor 4 new test batches in real-time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots of deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate some granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to the density and flowability, respectively, as Y-matrix were developed. The scores of the 4 test
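
    In outline, the batch-modeling recipe reads as follows. The sketch below is a toy version with synthetic "batches", using scikit-learn's PLS regression rather than the authors' software: unfold the reference batches, regress the process variables against batch time, and chart a new batch's scores against the reference mean trajectory.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(8)

# Synthetic reference batches: 10 batches x 60 time points x 5 signals
# (PSD fractions and product temperature stand-ins)
t = np.linspace(0.0, 1.0, 60)
profile = np.array([1.0, 0.5, -0.3, 0.8, 0.2])
ref = np.stack([np.outer(t, profile) + rng.normal(0, 0.05, (60, 5))
                for _ in range(10)])

X = ref.reshape(-1, 5)                    # unfold: (batches*time) x variables
y = np.tile(t, 10)                        # regress X against batch process time
pls = PLSRegression(n_components=2).fit(X, y)

# Score control chart: reference mean trajectory +/- 3 sd at each time point
scores = pls.transform(X)[:, 0].reshape(10, 60)
mean_traj, sd_traj = scores.mean(axis=0), scores.std(axis=0, ddof=1)

new_batch = np.outer(t, profile) + rng.normal(0, 0.2, (60, 5))  # a test batch
new_score = pls.transform(new_batch)[:, 0]
out = np.where(np.abs(new_score - mean_traj) > 3 * sd_traj)[0]
print("time points outside the control limits:", out)
```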

  17. 2014 ICSA/KISS Joint Applied Statistics Symposium

    CERN Document Server

    Liu, Mengling; Luo, Xiaolong

    2016-01-01

    The papers in this volume represent the most timely and advanced contributions to the 2014 Joint Applied Statistics Symposium of the International Chinese Statistical Association (ICSA) and the Korean International Statistical Society (KISS), held in Portland, Oregon. The contributions cover new developments in statistical modeling and clinical research: including model development, model checking, and innovative clinical trial design and analysis. Each paper was peer-reviewed by at least two referees and also by an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including from North America, Asia, and Europe. It offered 3 keynote speeches, 7 short courses, 76 parallel scientific sessions, student paper sessions, and social events. The most timely and advanced contributions from the joint 2014 ICSA/KISS Applied Statistics Symposium All papers feature original, peer-reviewed content Coverage consists of new developments in statisti...

  18. Timber resource statistics for the Yakataga inventory unit, Alaska, 1976.

    Science.gov (United States)

    Willem W.S. van Hees

    1985-01-01

    Statistics on forest area, total gross and net timber volumes, and annual net growth and mortality are presented from the 1976 timber inventory of the Yakataga unit, Alaska. Timberland area is estimated at 209.3 thousand acres (84.7 thousand ha), net growing stock volume at 917.1 million cubic feet (26.0 million m3), and annual net growth and...

  19. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  20. Automation in Siemens fuel manufacturing - the basis for quality improvement by statistical process control (SPC)

    International Nuclear Information System (INIS)

    Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.

    1999-01-01

    Statistical Process Control (SPC) is one of the systematic tools that make a valuable contribution to the control and planning activities for manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a program to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on a realization of SPC in 3 pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC and subsequent modifications were made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser-welding of spacers' are located at the different locations of ANF. Due to the completion of the first steps and the experience obtained in the pilot projects, the introduction program for SPC has already been extended to other manufacturing processes. (authors)

  1. Squeezing, photon bunching, photon antibunching and nonclassical photon statistics in degenerate hyper Raman processes

    International Nuclear Information System (INIS)

    Sen, Biswajit; Mandal, Swapan

    2007-01-01

    An initially prepared coherent state coupled to a second-order nonlinear medium is responsible for stimulated and spontaneous hyper Raman processes. By using an intuitive approach based on perturbation theory, the Hamiltonian corresponding to the hyper Raman processes is analytically solved to obtain the temporal development of the field operators. These analytical solutions are valid for small coupling constants, but, interestingly, they remain valid for reasonably large times. Hence, the present analytical solutions are quite general and new compared to solutions obtained under short-time approximations. By exploiting the analytical solutions of the field operators for the various modes, we investigate squeezing, photon antibunching and nonclassical photon statistics for pure modes of the input coherent light responsible for hyper Raman processes. In at least one instance (stimulated hyper Raman processes for the vibration phonon mode), we report the simultaneous appearance of classical (photon bunching) and nonclassical (squeezing) effects of the radiation field responsible for hyper Raman processes

  2. Performance of chip seals using local and minimally processed aggregates for preservation of low traffic volume roadways.

    Science.gov (United States)

    2013-07-01

    This report documents the performance of two low traffic volume experimental chip seals constructed using locally available, minimally processed sand and gravel aggregates after four winters of service. The projects were constructed by CDOT maint...

  3. ZnO crystals obtained by electrodeposition: Statistical analysis of most important process variables

    International Nuclear Information System (INIS)

    Cembrero, Jesus; Busquets-Mataix, David

    2009-01-01

    In this paper a comparative study, by means of a statistical analysis, of the main process variables affecting ZnO crystal electrodeposition is presented. ZnO crystals were deposited on two different substrates, silicon wafer and indium tin oxide. The control variables were substrate type, electrolyte concentration, temperature, exposure time and current density. The morphologies of the deposits on the different substrates were observed using scanning electron microscopy. The percentage of substrate area covered by the ZnO deposit was calculated by computational image analysis. The design of the applied experiments was based on a two-level factorial analysis involving a series of 32 experiments and an analysis of variance. Statistical results reveal that the variables exerting a significant influence on the area covered by the ZnO deposit are electrolyte concentration, substrate type and time of deposition, together with a combined two-factor interaction between temperature and current density. However, morphology is also influenced by the surface roughness of the substrates

  4. Volume reduction outweighs biogeochemical processes in controlling phosphorus treatment in aged detention systems

    Science.gov (United States)

    Shukla, Asmita; Shukla, Sanjay; Annable, Michael D.; Hodges, Alan W.

    2017-08-01

    Stormwater detention areas (SDAs) play an important role in treating end-of-farm runoff in phosphorus (P)-limited agroecosystems. Phosphorus transport from SDAs, including through subsurface pathways, is not well understood. The prevailing understanding of these systems assumes that biogeochemical processes play the primary treatment role and that subsurface losses can be neglected. Water and P fluxes from an SDA located on a row-crop farm were measured for two years (2009-2011) to assess the SDA's role in reducing downstream P loads. The SDA treated 55% (497 kg) and 95% (205 kg) of the incoming load during Year 1 (Y1, 09-10) and Year 2 (Y2, 10-11), respectively. These treatment efficiencies were similar to the surface water volumetric retention (49% in Y1 and 84% in Y2) and varied primarily with rainfall. Similar water volume and P retentions indicate that volume retention is the main process controlling P loads. A limited role of biogeochemical processes was supported by low to no remaining soil P adsorption capacity due to long-term drainage P input. The fact that outflow P concentrations (Y1 = 368.3 μg L⁻¹, Y2 = 230.4 μg L⁻¹) could be approximated by a simple mixing of rainfall and drainage P input further confirmed the nearly inert biogeochemical processes. Subsurface P losses through groundwater were 304 kg (27% of inflow P), indicating that they are an important source of downstream P. Including subsurface P losses reduces the treatment efficiency to 35% (from 61%). The aboveground biomass in the SDA contained 42% (240 kg) of the average incoming P load, suggesting that biomass harvesting could be a cost-effective alternative for reviving the role of biogeochemical processes to enhance P treatment in aged, P-saturated SDAs. The 20-year present economic value of P removal through harvesting was estimated to be $341,000, which if covered through a cost share or a payment for P treatment services program could be a positive outcome for both
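
    The efficiency arithmetic above can be reproduced with a simple mass balance. In the sketch below the annual inflow loads are back-calculated from the stated percentages, so the inputs are approximate reconstructions rather than reported values.

        # Mass-balance sketch of the reported treatment efficiencies.
        inflow_y1 = 497 / 0.55   # ~904 kg entered in Year 1 (497 kg was 55% of it)
        inflow_y2 = 205 / 0.95   # ~216 kg entered in Year 2
        inflow = inflow_y1 + inflow_y2

        surface_retained = 497 + 205   # P retention judged from surface outflow alone
        subsurface_loss = 304          # P leaving via groundwater (27% of inflow)

        eff_surface_only = surface_retained / inflow
        eff_with_subsurface = (surface_retained - subsurface_loss) / inflow
        print(f"surface-only efficiency: {eff_surface_only:.0%}")    # close to the reported 61%
        print(f"with subsurface losses:  {eff_with_subsurface:.0%}") # close to the reported 35%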

  5. The effect of thermo-mechanical processing on the mechanical properties of molybdenum - 2 volume % lanthana

    International Nuclear Information System (INIS)

    Mueller, A.J.; Shields, J.A. Jr.; Buckman, R.W. Jr.

    2001-01-01

    Variations in oxide species and consolidation method have been shown to have a significant effect on the mechanical properties of oxide dispersion strengthened (ODS) molybdenum. The mechanical behavior of molybdenum - 2 volume % La2O3 mill product forms, produced by CSM Industries by a wet doping process, was characterized over the temperature range of -150 °C to 1800 °C. The various mill product forms evaluated ranged from thin sheet stock to bar stock. Tensile properties of the material in the various product forms were not significantly affected by the vast difference in total cold work. Creep properties, however, were sensitive to the total amount of cold work as well as the starting microstructure. Stress-relieved material had superior creep rupture properties to recrystallized material at 1200 °C, while at 1500 °C and above the opposite was observed. Thus it is necessary to match the appropriate thermo-mechanical processing and microstructure of molybdenum - 2 volume % La2O3 to the demands of the application being considered. (author)

  6. Statistical tests for power-law cross-correlated processes

    Science.gov (United States)

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. The Cauchy inequality -1 ≤ ρDCCA(T,n) ≤ 1 had previously been verified numerically; here we derive it for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρDCCA(T,n) tends to 1/T with increasing T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
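
    For readers who want to experiment with the coefficient, here is a minimal sketch of ρDCCA for a single window size n, using non-overlapping windows and linear detrending; the synthetic series and the window size are illustrative assumptions, and the paper's significance thresholds are not reproduced.

        import numpy as np

        def rho_dcca(x, y, n):
            """Detrended cross-correlation coefficient for window size n."""
            X = np.cumsum(x - np.mean(x))  # integrated profiles
            Y = np.cumsum(y - np.mean(y))
            f_xy, f_xx, f_yy = [], [], []
            t = np.arange(n)
            for s in range(0, len(X) - n + 1, n):  # non-overlapping windows
                rx = X[s:s + n] - np.polyval(np.polyfit(t, X[s:s + n], 1), t)
                ry = Y[s:s + n] - np.polyval(np.polyfit(t, Y[s:s + n], 1), t)
                f_xy.append(np.mean(rx * ry))
                f_xx.append(np.mean(rx * rx))
                f_yy.append(np.mean(ry * ry))
            return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

        rng = np.random.default_rng(1)
        z = rng.normal(size=4096)
        x = z + rng.normal(size=4096)  # two series sharing a common component
        y = z + rng.normal(size=4096)
        print(rho_dcca(x, y, n=32))    # positive, well inside [-1, 1]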

  7. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...
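
    One of the bridges the book builds between sampling theory and statistical process control is the experimental variogram of a one-dimensional process stream. As a minimal sketch under the usual assumptions (evenly spaced assay values h_i and the estimator V(j) = Σ(h_{i+j} - h_i)² / (2(N - j))), with a purely hypothetical stream:

        import numpy as np

        def variogram(h, max_lag):
            """Experimental variogram V(j) of a stream of assay values h_i."""
            h = np.asarray(h, dtype=float)
            N = len(h)
            return {j: np.sum((h[j:] - h[:-j]) ** 2) / (2 * (N - j))
                    for j in range(1, max_lag + 1)}

        rng = np.random.default_rng(2)
        stream = 10 + np.cumsum(rng.normal(0, 0.1, 200))  # slowly drifting grade
        print(variogram(stream, max_lag=5))  # rising V(j) reflects drifting process variability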

  8. Penultimate modeling of spatial extremes: statistical inference for max-infinitely divisible processes

    KAUST Repository

    Huser, Raphaël

    2018-01-09

    Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability of the dependence does not prevail in finite samples. This issue is particularly serious when data are asymptotically independent, such that the dependence strength weakens and eventually vanishes as events become more extreme. We here aim to provide flexible sub-asymptotic models for spatially indexed block maxima, which more realistically account for discrepancies between data and asymptotic theory. We develop models pertaining to the wider class of max-infinitely divisible processes, extending the class of max-stable processes while retaining dependence properties that are natural for maxima: max-id models are positively associated, and they yield a self-consistent family of models for block maxima defined over any time unit. We propose two parametric construction principles for max-id models, emphasizing a point process-based generalized spectral representation, that allows for asymptotic independence while keeping the max-stable extremal-t model as a special case. Parameter estimation is efficiently performed by pairwise likelihood, and we illustrate our new modeling framework with an application to Dutch wind gust maxima calculated over different time units.

  9. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    Science.gov (United States)

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
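
    The six person-fit statistics themselves are not given in this record; the sketch below shows only the generic tabular CUSUM mechanism they build on, applied to hypothetical standardized residuals (observed minus expected item scores). The reference value k, the residuals and the flagging threshold are illustrative assumptions.

        # Generic one-sided upper/lower CUSUMs over per-item residuals.
        def cusum(residuals, k=0.5):
            """Return the (C+, C-) paths; a drift past a threshold h signals misfit."""
            c_plus, c_minus, path = 0.0, 0.0, []
            for r in residuals:
                c_plus = max(0.0, c_plus + r - k)    # accumulates positive drift
                c_minus = min(0.0, c_minus + r + k)  # accumulates negative drift
                path.append((c_plus, c_minus))
            return path

        # Hypothetical residuals that start drifting upward mid-test:
        residuals = [0.1, -0.2, 0.4, 1.1, 0.9, 1.3, 0.8, 1.2]
        for item, (cp, cm) in enumerate(cusum(residuals), 1):
            print(f"item {item}: C+ = {cp:.2f}, C- = {cm:.2f}")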

  10. 9th Symposium on Computational Statistics

    CERN Document Server

    Mildner, Vesna

    1990-01-01

    Although probably no one is too enthused about the idea, it is a fact that the development of most empirical sciences depends to a great extent on the development of data analysis methods and techniques, which, due to the necessity of applying computers for that purpose, means that it practically depends on the advancement and orientation of computational statistics. Every other year the International Association for Statistical Computing sponsors the organization of meetings of individuals professionally involved in computational statistics. Since these meetings attract professionals from all over the world, they are a good sample for estimating trends in this area, which some believe is statistics proper while others claim it is computer science. It seems, though, that an increasing number of colleagues treat it as an independent scientific, or at least technical, discipline. This volume contains six invited papers, 41 contributed papers and, finally, two papers which are, formally, softwa...

  11. Editor's Choice - High Annual Hospital Volume is Associated with Decreased in Hospital Mortality and Complication Rates Following Treatment of Abdominal Aortic Aneurysms: Secondary Data Analysis of the Nationwide German DRG Statistics from 2005 to 2013.

    Science.gov (United States)

    Trenner, Matthias; Kuehnl, Andreas; Salvermoser, Michael; Reutersberg, Benedikt; Geisbuesch, Sarah; Schmid, Volker; Eckstein, Hans-Henning

    2018-02-01

    The aim of this study was to analyse the association between annual hospital procedural volume and post-operative outcomes following repair of abdominal aortic aneurysms (AAA) in Germany. Data were extracted from the nationwide Diagnosis Related Group (DRG) statistics provided by the German Federal Statistical Office. Cases with a diagnosis of AAA (ICD-10 GM I71.3, I71.4) and procedure codes for endovascular aortic repair (EVAR; OPS 5-38a.1*) or open aortic repair (OAR; OPS 5-38.45, 5-38.47) treated between 2005 and 2013 were included. Hospitals were empirically grouped into quartiles depending on the overall annual volume of AAA procedures. A multilevel multivariable regression model was applied to adjust for sex, medical risk, type of procedure, and type of admission. The primary outcome was in-hospital mortality. Secondary outcomes were complications, use of blood products, and length of stay (LOS). The association between AAA volume and in-hospital mortality was also estimated as a function of continuous volume. A total of 96,426 cases, of which 11,795 (12.6%) presented as ruptured (r)AAA, were treated in >700 hospitals (annual median: 501). The crude in-hospital mortality was 3.3% after intact (i)AAA repair (OAR 5.3%; EVAR 1.7%). Volume was inversely associated with mortality after OAR and EVAR. Complication rates, LOS, and use of blood products were lower in high volume hospitals. After rAAA repair, crude mortality was 40.4% (OAR 43.2%; EVAR 27.4%). An inverse association between mortality and volume was shown for rAAA repair; the same holds for the use of blood products. When considering volume as a continuous variate, an annual caseload of 75-100 elective cases was associated with the lowest mortality risk. In-hospital mortality and complication rates following AAA repair are inversely associated with annual hospital volume. The use of blood products and the LOS are lower in high volume hospitals. A minimum annual case threshold for AAA procedures might improve
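
    The study's central manoeuvre, grouping hospitals into annual-volume quartiles and comparing crude in-hospital mortality across the groups, is easy to mimic. The sketch below uses simulated hospitals with a built-in inverse volume-mortality association; none of the numbers come from the DRG data, and the paper's multilevel risk adjustment is omitted.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(3)
        df = pd.DataFrame({"annual_volume": rng.integers(1, 150, size=700)})
        # Build in an inverse volume-mortality association, for illustration only.
        p_death = 0.06 - 0.0003 * df["annual_volume"].clip(upper=100)
        df["deaths"] = rng.binomial(df["annual_volume"], p_death)

        df["quartile"] = pd.qcut(df["annual_volume"], 4,
                                 labels=["Q1 (low)", "Q2", "Q3", "Q4 (high)"])
        summary = df.groupby("quartile", observed=True).agg(
            deaths=("deaths", "sum"), cases=("annual_volume", "sum"))
        summary["crude_mortality"] = summary["deaths"] / summary["cases"]
        print(summary)  # crude mortality should fall from Q1 to Q4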

  12. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human-system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.

  13. Accurate corresponding point search using sphere-attribute-image for statistical bone model generation

    International Nuclear Information System (INIS)

    Saito, Toki; Nakajima, Yoshikazu; Sugita, Naohiko; Mitsuishi, Mamoru; Hashizume, Hiroyuki; Kuramoto, Kouichi; Nakashima, Yosio

    2011-01-01

    Statistical deformable model based two-dimensional/three-dimensional (2-D/3-D) registration is a promising method for estimating the position and shape of patient bone in the surgical space. Since its accuracy depends on the capacity of the statistical model, we propose a method for accurately generating a statistical bone model from a CT volume. Our method employs the Sphere-Attribute-Image (SAI) and improves the accuracy of the corresponding point search in statistical model generation. First, target bone surfaces are extracted as SAIs from the CT volume. Then the textures of the SAIs are classified into regions using the Maximally Stable Extremal Regions (MSER) method. Next, corresponding regions are determined using normalized cross-correlation (NCC). Finally, corresponding points in each corresponding region are determined using NCC. We applied our method to femur bone models, and it worked well in the experiments. (author)
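
    The matching score named above, normalized cross-correlation, is simple to state: subtract each patch's mean, then divide the dot product by the product of the patch norms. A minimal sketch with hypothetical attribute patches follows; the SAI extraction and MSER classification steps are not reproduced.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation between two equal-sized patches."""
            a = np.asarray(a, float).ravel()
            b = np.asarray(b, float).ravel()
            a = a - a.mean()  # zero-mean both patches
            b = b - b.mean()
            denom = np.sqrt(np.sum(a * a) * np.sum(b * b))
            return float(np.sum(a * b) / denom) if denom > 0 else 0.0

        # Identical patches give NCC = 1; a mismatched patch scores lower.
        patch = np.arange(25.0).reshape(5, 5)
        print(ncc(patch, patch))        # 1.0
        print(ncc(patch, patch[::-1]))  # lower for a row-reversed patch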

  14. Biometry: the principles and practice of statistics in biological research

    National Research Council Canada - National Science Library

    Sokal, R.R; Rohlf, F.J

    1969-01-01

    In this introductory textbook, with its companion volume of tables, the authors provide a balanced presentation of statistical methodology for the descriptive, experimental, and analytical study of biological phenomena...

  15. 3D CT modeling of hepatic vessel architecture and volume calculation in living donated liver transplantation

    International Nuclear Information System (INIS)

    Frericks, Bernd B.; Caldarone, Franco C.; Savellano, Dagmar Hoegemann; Stamm, Georg; Kirchhoff, Timm D.; Shin, Hoen-Oh; Galanski, Michael; Nashan, Bjoern; Klempnauer, Juergen; Schenk, Andrea; Selle, Dirk; Spindler, Wolf; Peitgen, Heinz-Otto

    2004-01-01

    The aim of this study was to evaluate a software tool for non-invasive preoperative volumetric assessment of potential donors in living donated liver transplantation (LDLT). Biphasic helical CT was performed in 56 potential donors. Data sets were post-processed using a non-commercial software tool for segmentation, volumetric analysis and visualisation of liver segments. Semi-automatic definition of liver margins allowed the segmentation of the parenchyma. Hepatic vessels were delineated using a region-growing algorithm with automatically determined thresholds. Volumes and shapes of liver segments were calculated automatically based on individual portal-venous branches. Results were visualised three-dimensionally and statistically compared with conventional volumetry and the intraoperative findings in 27 transplanted cases. Image processing could be performed easily, within 23 min. Of the 56 potential donors, 27 were excluded from LDLT because of inappropriate liver parenchyma or vascular architecture. Two recipients were not transplanted due to poor clinical conditions. In the 27 transplanted cases, preoperatively visualised vessels were confirmed, and only one undetected accessory hepatic vein was revealed. Calculated graft volumes were 1110±180 ml for right lobes, 820 ml for the left lobe and 270±30 ml for segments II+III. The calculated volumes and intraoperatively measured graft volumes correlated significantly. No significant differences between the presented automatic volumetry and the conventional volumetry were observed. A novel image processing technique was evaluated which allows a semi-automatic volume calculation and 3D visualisation of the different liver segments. (orig.)
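
    The volumetric step of such a pipeline reduces to counting segmented voxels and multiplying by the voxel volume. A minimal sketch, assuming a binary segment mask and a hypothetical voxel spacing; the segmentation and portal-branch assignment described above are outside its scope.

        import numpy as np

        def segment_volume_ml(mask, spacing_mm):
            """Volume of a binary mask (1 = liver segment) in millilitres."""
            voxel_mm3 = float(np.prod(spacing_mm))  # volume of one voxel in mm^3
            return mask.sum() * voxel_mm3 / 1000.0  # 1 ml = 1000 mm^3

        mask = np.zeros((256, 256, 120), dtype=np.uint8)
        mask[60:200, 60:200, 20:100] = 1  # toy stand-in for a segmented liver segment
        print(segment_volume_ml(mask, spacing_mm=(0.7, 0.7, 2.0)))  # ~1537 ml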

  16. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  17. Volume and Surface-Enhanced Volume Negative Ion Sources

    International Nuclear Information System (INIS)

    Stockli, M P

    2013-01-01

    H⁻ volume sources and, especially, caesiated H⁻ volume sources are important ion sources for generating high-intensity proton beams, which in turn generate large quantities of other particles. This chapter discusses the physics and technology of the volume production and the caesium-enhanced (surface) production of H⁻ ions. Starting with Bacal's discovery of H⁻ volume production, the chapter briefly recounts the development of some H⁻ sources, which capitalized on this process to significantly increase the production of H⁻ beams. Another significant increase was achieved in the 1990s by adding caesiated surfaces to supplement the volume-produced ions with surface-produced ions, as illustrated with other H⁻ sources. Finally, the focus turns to some of the experience gained when such a source was successfully ramped up in H⁻ output and in duty factor to support the generation of 1 MW proton beams for the Spallation Neutron Source. (author)

  18. Infant Statistical-Learning Ability Is Related to Real-Time Language Processing

    Science.gov (United States)

    Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf

    2018-01-01

    Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…
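
    The statistical regularity most often probed in these artificial language studies is the transitional probability between adjacent syllables, which is high within words and drops at word boundaries. A minimal sketch with a hypothetical Saffran-style syllable stream (not materials from the study):

        from collections import Counter

        def transitional_probs(syllables):
            """P(next | current) for each adjacent syllable pair in the stream."""
            pairs = Counter(zip(syllables, syllables[1:]))
            firsts = Counter(syllables[:-1])
            return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

        stream = "bi da ku pa do ti bi da ku go la tu pa do ti bi da ku".split()
        for pair, p in sorted(transitional_probs(stream).items()):
            print(pair, round(p, 2))  # within-word pairs (e.g. bi->da) score 1.0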

  19. New municipal solid waste processing technology reduces volume and provides beneficial reuse applications for soil improvement and dust control

    Science.gov (United States)

    A garbage-processing technology has been developed that shreds, sterilizes, and separates inorganic and organic components of municipal solid waste. The technology not only greatly reduces waste volume, but the non-composted byproduct of this process, Fluff®, has the potential to be utilized as a s...

  20. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt