Multivariate Statistical Process Control
DEFF Research Database (Denmark)
Kulahci, Murat
2013-01-01
As sensor and computer technology continues to improve, it is increasingly common to be confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim...... is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive...... amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
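A minimal sketch of the latent-structure monitoring idea described above, assuming synthetic data; the dimensions, seed, and latent-factor construction are illustrative assumptions, not taken from the paper. New observations are projected onto a few principal components and a Hotelling-type T2 statistic is computed in that reduced space.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated in-control data: 200 samples of 10 cross-correlated variables
# driven by 3 latent factors (all sizes here are illustrative)
latent = rng.normal(size=(200, 3))
X = latent @ rng.normal(size=(3, 10)) + 0.1 * rng.normal(size=(200, 10))

# Center the data and keep the first k principal components
mu = X.mean(axis=0)
Xc = X - mu
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
var_k = s[:k] ** 2 / (len(X) - 1)     # variance captured by each kept PC

def t2(x_new):
    """Hotelling's T^2 of a new observation in the reduced PC space."""
    z = (x_new - mu) @ Vt[:k].T       # scores on the first k PCs
    return float(np.sum(z ** 2 / var_k))

# The process mean scores T^2 = 0; a point shifted 8 standard deviations
# along the first PC scores T^2 = 64, far beyond a typical chi-square limit
print(t2(mu), t2(mu + 8 * np.sqrt(var_k[0]) * Vt[0]))
```

In practice the number of retained components and the control limit (usually a chi-square or F quantile) would be chosen from in-control reference data.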
Mathematical statistics and stochastic processes
Bosq, Denis
2013-01-01
Generally, books on mathematical statistics are restricted to the case of independent, identically distributed random variables. In this book, however, both this case and the case of dependent variables, i.e. statistics for discrete- and continuous-time processes, are studied. This second case is very important for today's practitioners. Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of the theory of probability, estimation, confidence intervals, non-parametric statistics and rob
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Fundamentals of statistical signal processing
Kay, Steven M
1993-01-01
A unified presentation of parameter estimation for those involved in the design and implementation of statistical signal processing algorithms. Covers important approaches to obtaining an optimal estimator and analyzing its performance; and includes numerous examples as well as applications to real- world problems. MARKETS: For practicing engineers and scientists who design and analyze signal processing systems, i.e., to extract information from noisy signals — radar engineer, sonar engineer, geophysicist, oceanographer, biomedical engineer, communications engineer, economist, statistician, physicist, etc.
Statistical thermodynamics of nonequilibrium processes
Keizer, Joel
1987-01-01
The structure of the theory of thermodynamics has changed enormously since its inception in the middle of the nineteenth century. Shortly after Thomson and Clausius enunciated their versions of the Second Law, Clausius, Maxwell, and Boltzmann began actively pursuing the molecular basis of thermodynamics, work that culminated in the Boltzmann equation and the theory of transport processes in dilute gases. Much later, Onsager undertook the elucidation of the symmetry of transport coefficients and, thereby, established himself as the father of the theory of nonequilibrium thermodynamics. Combining the statistical ideas of Gibbs and Langevin with the phenomenological transport equations, Onsager and others went on to develop a consistent statistical theory of irreversible processes. The power of that theory is in its ability to relate measurable quantities, such as transport coefficients and thermodynamic derivatives, to the results of experimental measurements. As powerful as that theory is, it is linear and...
Statistical estimation of process holdup
International Nuclear Information System (INIS)
Harris, S.P.
1988-01-01
Estimates of potential process holdup and their random and systematic error variances are derived to improve the inventory difference (ID) estimate and its associated measure of uncertainty for a new process at the Savannah River Plant. Since the process is in a start-up phase, data have not yet accumulated for statistical modelling. The material produced in the facility will be a very pure, highly enriched 235U with very small isotopic variability. Therefore, data published in LANL's unclassified report on Estimation Methods for Process Holdup of Special Nuclear Materials were used as a starting point for the modelling process. LANL's data were gathered through a series of designed measurements of special nuclear material (SNM) holdup at two of their materials-processing facilities. They had also taken steps to improve the quality of the data through controlled, larger-scale experiments outside of LANL at highly enriched uranium processing facilities. The data they have accumulated are on an equipment-component basis. Our modelling has been restricted to the wet chemistry area. We have developed predictive models for each of our process components based on the LANL data. 43 figs
Statistical inference for Cox processes
DEFF Research Database (Denmark)
Møller, Jesper; Waagepetersen, Rasmus Plenge
2002-01-01
Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome...... research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters...... that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space...
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Statistical Inference at Work: Statistical Process Control as an Example
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
Statistical Process Control for KSC Processing
Ford, Roger G.; Delgado, Hector; Tilley, Randy
1996-01-01
The 1996 Summer Faculty Fellowship Program at Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed an evaluation of SPC use at KSC, both present and potential, in light of the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was provided, as was an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow for future consultation as needs arise.
Representative volume size: A comparison of statistical continuum mechanics and statistical physics
Energy Technology Data Exchange (ETDEWEB)
AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.
1999-05-01
In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.
Statistical processing of experimental data
NAVRÁTIL, Pavel
2012-01-01
This thesis covers probability theory and statistical sets: solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Answers to the unsolved problems are included.
Energy Technology Data Exchange (ETDEWEB)
A.G. Crook Company
1993-04-01
This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.
Statistical aspects of determinantal point processes
DEFF Research Database (Denmark)
Lavancier, Frédéric; Møller, Jesper; Rubak, Ege
The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical infer...
Improving Instruction Using Statistical Process Control.
Higgins, Ronald C.; Messer, George H.
1990-01-01
Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)
Statistical aspects of determinantal point processes
DEFF Research Database (Denmark)
Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger
The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical...... inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference...
Applicability of statistical process control techniques
Schippers, W.A.J.
1998-01-01
This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some
Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics
Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, AgnèS.
2003-06-01
We analyze the volume distribution of natural rockfalls on different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2 to 10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes; in this case neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
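A small sketch of how such a power-law exponent can be estimated, assuming synthetic Pareto-distributed "volumes"; the sample size, v_min, and seed are illustrative assumptions, and only the exponent value 0.5 comes from the abstract above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw synthetic "rockfall volumes" with power-law survival function
# P(V > v) = (v_min / v)^b and b = 0.5, the exponent reported above for
# subvertical cliffs (sample size and v_min are illustrative)
b_true, v_min, n = 0.5, 1.0, 5000
volumes = v_min * rng.uniform(size=n) ** (-1.0 / b_true)

# Maximum-likelihood (Hill) estimator of the power-law exponent
b_hat = n / np.sum(np.log(volumes / v_min))
print(round(b_hat, 3))  # close to the true value 0.5
```

Real catalogs additionally require choosing v_min from the data, since small events are under-recorded; the estimator above assumes v_min is known.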
Statistical process control for serially correlated data
Wieringa, Jakob Edo
1999-01-01
Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice
On statistical analysis of compound point process
Czech Academy of Sciences Publication Activity Database
Volf, Petr
2006-01-01
Vol. 35, No. 2-3 (2006), pp. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * hazard function * Cox model Subject RIV: BB - Applied Statistics, Operational Research
Statistical process control for residential treated wood
Patricia K. Lebow; Timothy M. Young; Stan Lebow
2017-01-01
This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...
Nonparametric predictive inference in statistical process control
Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.
2000-01-01
New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on
Nonparametric predictive inference in statistical process control
Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.
2004-01-01
Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric
Multivariate Statistical Process Control Charts: An Overview
Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John
2006-01-01
In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...
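As a concrete instance of one chart family reviewed above, here is a univariate EWMA chart, the building block of its multivariate (MEWMA) extension; the process parameters, smoothing constant, and shift size are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative process: 30 in-control points, then a sustained mean shift
lam, mu0, sigma = 0.2, 0.0, 1.0
x = np.concatenate([rng.normal(mu0, sigma, 30),
                    rng.normal(mu0 + 1.5, sigma, 20)])

# EWMA recursion: z_i = lam * x_i + (1 - lam) * z_{i-1}, starting at mu0
z = np.empty_like(x)
prev = mu0
for i, xi in enumerate(x):
    prev = lam * xi + (1 - lam) * prev
    z[i] = prev

# Time-varying 3-sigma control limits for the EWMA statistic
t = np.arange(1, len(x) + 1)
se = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
signals = np.where(np.abs(z - mu0) > 3 * se)[0]
print(signals)  # flagged indices; the shift begins at index 30
```

Because the EWMA pools information across observations, it detects small sustained shifts that a Shewhart chart with the same limits would often miss.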
Applied Behavior Analysis and Statistical Process Control?
Hopkins, B. L.
1995-01-01
Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
Analysis of Variance in Statistical Image Processing
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
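The common-cause versus special-cause distinction described above can be sketched with an individuals (XmR) chart; the measurement values, subgroup of 20 baseline weeks, and intervention shift are illustrative assumptions, not data from the article.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated weekly measurements: 20 baseline points (common-cause noise
# around 50), then 10 points after a hypothetical intervention shifts the
# mean to 58 (all numbers illustrative)
data = np.concatenate([rng.normal(50, 2, 20), rng.normal(58, 2, 10)])

# Individuals (XmR) chart: estimate sigma from the average moving range
baseline = data[:20]
sigma_hat = np.abs(np.diff(baseline)).mean() / 1.128  # d2 constant, n = 2
center = baseline.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

# Points outside the 3-sigma limits signal special-cause variation
signals = np.where((data > ucl) | (data < lcl))[0]
print(round(center, 1), round(ucl, 1), signals)
```

Limits are computed from the baseline period only, so post-intervention points that fall outside them are evidence of special-cause change rather than routine fluctuation.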
The statistical process control methods - SPC
Directory of Open Access Journals (Sweden)
Floreková Ľubica
1998-03-01
Full Text Available Methods for the statistical evaluation of quality, SPC (item 20 of the quality-control documentation system of the ISO 9000-series norm), for various processes, products and services belong amongst the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the following principles are presented in the contribution: diagnostics of cause and effect; Pareto analysis and the Lorenz curve; number distributions and frequency curves of random variable distributions; and Shewhart control charts.
Statistical image processing and multidimensional modeling
Fieguth, Paul
2010-01-01
Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over
Statistical process control for alpha spectroscopy
Energy Technology Data Exchange (ETDEWEB)
Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)
1995-10-01
Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility, as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.
Statistical process control for alpha spectroscopy
International Nuclear Information System (INIS)
Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.
1995-01-01
Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility, as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination, using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs
International Nuclear Information System (INIS)
Lacombe, J.P.
1985-12-01
A statistical study of non-homogeneous and spatial Poisson processes forms the first part of this thesis. A Neyman-Pearson type test is defined concerning the intensity measurement of these processes. Conditions are given under which the consistency of the test is assured, together with others giving the asymptotic normality of the test statistics. Then some techniques for the statistical processing of Poisson fields and their applications to a particle multidetector study are given. Quality tests of the device are proposed together with signal extraction methods [fr]
Statistical processing of technological and radiochemical data
International Nuclear Information System (INIS)
Lahodova, Zdena; Vonkova, Kateřina
2011-01-01
The project described in this article had two goals. The main goal was to compare technological and radiochemical data from two units of nuclear power plant. The other goal was to check the collection, organization and interpretation of routinely measured data. Monitoring of analytical and radiochemical data is a very valuable source of knowledge for some processes in the primary circuit. Exploratory analysis of one-dimensional data was performed to estimate location and variability and to find extreme values, data trends, distribution, autocorrelation etc. This process allowed for the cleaning and completion of raw data. Then multiple analyses such as multiple comparisons, multiple correlation, variance analysis, and so on were performed. Measured data was organized into a data matrix. The results and graphs such as Box plots, Mahalanobis distance, Biplot, Correlation, and Trend graphs are presented in this article as statistical analysis tools. Tables of data were replaced with graphs because graphs condense large amounts of information into easy-to-understand formats. The significant conclusion of this work is that the collection and comprehension of data is a very substantial part of statistical processing. With well-prepared and well-understood data, its accurate evaluation is possible. Cooperation between the technicians who collect data and the statistician who processes it is also very important. (author)
PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT
Directory of Open Access Journals (Sweden)
B.P. Mahesh
2010-09-01
Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
Statistical process control for radiotherapy quality assurance
International Nuclear Information System (INIS)
Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.
2005-01-01
Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a few large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques are applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process, whereas with a standard deviation calculation no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process
Radiographic rejection index using statistical process control
International Nuclear Information System (INIS)
Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.
2015-01-01
The Repeat Analysis Index (IRR) is one of the items contained in the Quality Control Program dictated by the Brazilian law on radiological protection and should be evaluated frequently, at least every six months. In order to extract more and better information from the IRR, this study applies Statistical Quality Control to the reject rate through Statistical Process Control (a control chart for attributes p - GC) and the Pareto Chart (GP). Data collection was performed for 9 months, and during the last four months data were collected on a daily basis. The Control Limits (LC) were established and the Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analyzed. Relevant information such as service orders for the X-ray equipment and processors was cross-referenced to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The GC demonstrated the ability to predict equipment failures, and the GP showed clearly which causes are recurrent in the IRR. (authors) [pt]
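A sketch of the attributes (p) chart idea used above for a reject rate; the center line reuses the reported 8.8%, while the daily film count n and the 22-reject example day are illustrative assumptions, not data from the study.

```python
import numpy as np

# p-chart (control chart for attributes) for a film reject rate.
# p_bar uses the 8.8% reported above; n = 120 films/day is illustrative.
p_bar, n = 0.088, 120
sigma_p = np.sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * sigma_p
lcl = max(0.0, p_bar - 3 * sigma_p)

# A hypothetical day with 22 rejects out of 120 films (18.3%) exceeds the
# UCL and warrants investigation, e.g. against equipment service orders
day_rate = 22 / 120
print(round(lcl, 3), round(ucl, 3), day_rate > ucl)
```

With unequal daily film counts the limits vary day by day, since sigma_p depends on each day's n.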
Statistical shape modeling based renal volume measurement using tracked ultrasound
Pai Raikar, Vipul; Kwartowitz, David M.
2017-03-01
Autosomal dominant polycystic kidney disease (ADPKD) is the fourth most common cause of kidney transplant worldwide, accounting for 7-10% of all cases. Although ADPKD usually progresses over many decades, accurate risk prediction is an important task.1 Identifying patients with progressive disease is vital to providing the new treatments being developed and enabling them to enter clinical trials for new therapy. Among other factors, total kidney volume (TKV) is a major biomarker predicting the progression of ADPKD. The Consortium for Radiologic Imaging Studies in Polycystic Kidney Disease (CRISP)2 has shown that TKV is an early and accurate measure of cystic burden and likely growth rate. It is strongly associated with loss of renal function.3 While ultrasound (US) has proven an excellent tool for diagnosing the disease, monitoring short-term changes using ultrasound has been shown to be inaccurate. This is attributed to high operator variability and poor reproducibility as compared to tomographic modalities such as CT and MR (the gold standard). Ultrasound has emerged as a standout modality for intra-procedural imaging, and methods for spatial localization have afforded us the ability to track 2D ultrasound in the physical space in which it is being used. In addition, the vast amount of recorded tomographic data can be used to generate statistical shape models that allow us to extract clinical value from archived image sets. In this work, we aim to improve the prognostic value of US in managing ADPKD by assessing the accuracy of using statistical shape model augmented US data to predict TKV, with the end goal of monitoring short-term changes.
Mathematical SETI Statistics, Signal Processing, Space Missions
Maccone, Claudio
2012-01-01
This book introduces the Statistical Drake Equation where, from a simple product of seven positive numbers, the Drake Equation is turned into the product of seven positive random variables. The mathematical consequences of this transformation are demonstrated and it is proven that the new random variable N for the number of communicating civilizations in the Galaxy must follow the lognormal probability distribution when the number of factors in the Drake equation is allowed to increase at will. Mathematical SETI also studies the proposed FOCAL (Fast Outgoing Cyclopean Astronomical Lens) space mission to the nearest Sun Focal Sphere at 550 AU and describes its consequences for future interstellar precursor missions and truly interstellar missions. In addition the author shows how SETI signal processing may be dramatically improved by use of the Karhunen-Loève Transform (KLT) rather than Fast Fourier Transform (FFT). Finally, he describes the efforts made to persuade the United Nations to make the central part...
Spherical Process Models for Global Spatial Statistics
Jeong, Jaehong
2017-11-28
Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
Statistical process control for electron beam monitoring.
López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín
2015-07-01
To assess the electron beam monitoring statistical process control (SPC) in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
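The process capability ratios the abstract reports compare the specification band to the process spread. A hedged sketch of that calculation for a symmetric tolerance such as the ±2% specification level mentioned (the sigma value below is invented for illustration):

```python
# Process capability ratio Cp = (USL - LSL) / (6 * sigma).
# Cp >= 1.33 is a commonly quoted "capable" threshold; the abstract's
# ratios of 1.6 to 9.3 indicate a comfortably capable process.
def capability_ratio(usl, lsl, sigma):
    """Cp for a process with given spec limits and standard deviation."""
    return (usl - lsl) / (6.0 * sigma)

# Tolerance of ±2% around nominal; sigma = 0.25% is a made-up example value.
cp = capability_ratio(usl=2.0, lsl=-2.0, sigma=0.25)
print(round(cp, 3))
```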
Statistical physics of media processes: Mediaphysics
Kuznetsov, Dmitri V.; Mandel, Igor
2007-04-01
The processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, animal populations, etc., as the subject of a special scientific subbranch, "mediaphysics", are considered in their relation to sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a "person's mind" and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, "word of mouth", etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions were described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations of factors and the target variable.
Statistical Processing Algorithms for Human Population Databases
Directory of Open Access Journals (Sweden)
Camelia COLESCU
2012-01-01
Full Text Available The article describes some algorithms for statistical functions applied to a human population database. The samples are specific to the most interesting periods, in which the evolution of the statistical data takes on striking values. The article also describes the most useful forms of graphical presentation of the results.
DWARF GALAXY STARBURST STATISTICS IN THE LOCAL VOLUME
International Nuclear Information System (INIS)
Lee, Janice C.; Kennicutt, Robert C.; Akiyama, Sanae; Funes, S. J. Jose G.; Sakai, Shoko
2009-01-01
An unresolved question in galaxy evolution is whether the star formation histories (SFHs) of low-mass systems are preferentially dominated by starbursts or modes that are more quiescent and continuous. Here, we quantify the prevalence of global starbursts in dwarf galaxies at the present epoch and infer their characteristic durations and amplitudes. The analysis is based on the Hα component of the 11 Mpc Hα UV Galaxy Survey (11HUGS), which provides Hα and Galaxy Evolution Explorer UV imaging for an approximately volume-limited sample of ∼ 300 star-forming galaxies within 11 Mpc. We first examine the completeness properties of the sample, and then directly tally the number of bursting dwarfs and compute the fraction of star formation that is concentrated in such systems. To identify starbursting dwarfs, we use an integrated Hα equivalent width (EW) threshold of 100 Å, which corresponds to a stellar birthrate of ∼ 2.5, and also explore the use of empirical starburst definitions based on σ thresholds of the observed logarithmic EW distributions. Our results are robust to the exact choice of the threshold, and are consistent with a picture where dwarfs that are currently experiencing massive global bursts are just the ∼ 6% tip of a low-mass galaxy iceberg. Moreover, bursts are only responsible for about a quarter of the total star formation in the overall dwarf population, so the majority of stars in low-mass systems are not formed in this mode today. Spirals and irregulars devoid of Hα emission are rare, indicating that the complete cessation of star formation generally does not occur in such galaxies and is not characteristic of the interburst state, at least for the more luminous systems with M B < -15. The starburst statistics presented here directly constrain the duty cycle and the average burst amplitude under the simplest assumptions where all dwarf irregulars share a common SFH and undergo similar burst cycles with equal probability. Uncertainties
Spherical Process Models for Global Spatial Statistics
Jeong, Jaehong; Jun, Mikyoung; Genton, Marc G.
2017-01-01
Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture
Parametric statistical inference for discretely observed diffusion processes
DEFF Research Database (Denmark)
Pedersen, Asger Roer
Part 1: Theoretical results. Part 2: Statistical applications of Gaussian diffusion processes in freshwater ecology.
Statistical data processing with automatic system for environmental radiation monitoring
International Nuclear Information System (INIS)
Zarkh, V.G.; Ostroglyadov, S.V.
1986-01-01
The practice of statistical data processing for radiation monitoring is exemplified, and some results obtained are presented. Experience in the practical application of mathematical statistics methods to radiation monitoring data processing allowed a concrete statistical processing algorithm to be developed and implemented on an M-6000 minicomputer. The suggested algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The above algorithm was used to process data observed over the radioactive waste disposal control region. Results of processing the surface water monitoring data are presented.
Computing Science and Statistics: Volume 24. Graphics and Visualization
1993-03-20
Statistical properties of several models of fractional random point processes
Bendjaballah, C.
2011-08-01
The statistical properties of several models of fractional random point processes have been analyzed from the points of view of counting statistics and time interval statistics. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
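The reduced-variance criterion mentioned above compares the variance of the counts to their mean. A minimal sketch of that diagnostic, often called the Fano factor (the simulated process below is an arbitrary binomial thinning, not one of the paper's models):

```python
# Reduced variance (Fano factor) of a counting process: F = Var(N) / E(N).
# F = 1 for Poisson counts; F < 1 signals sub-Poissonian ("nonclassical")
# counting statistics. The simulation parameters here are arbitrary.
import random

def fano_factor(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

random.seed(1)
# Binomial(100, 0.1) counts: theoretical Fano factor is 1 - p = 0.9,
# i.e. slightly sub-Poissonian.
counts = [sum(1 for _ in range(100) if random.random() < 0.1)
          for _ in range(5000)]
print(round(fano_factor(counts), 2))
```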
Single photon laser altimeter simulator and statistical signal processing
Vacek, Michael; Prochazka, Ivan
2013-05-01
Spaceborne altimeters are common instruments onboard deep space rendezvous spacecraft. They provide range and topographic measurements critical to spacecraft navigation. Simultaneously, the receiver part may be utilized for an Earth-to-satellite link, one-way time transfer, and precise optical radiometry. The main advantage of the single photon counting approach is the ability to process signals with a very low signal-to-noise ratio, eliminating the need for large telescopes and a high-power laser source. Extremely small, rugged and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather a sufficient volume of data in repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission-specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The most promising single photon altimeter applications are low-orbit (˜10 km) and low-radial-velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (˜10 km), where range evaluation repetition rates of ˜100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.
Stamhuis, I.H.; Klep, P.M.M.; Maarseveen, J.G.S.J. van
2008-01-01
In the period 1850-1940 statistics developed as a new combination of theory and practice. A wide range of phenomena were looked at in a novel way and this statistical mindset had a pervasive influence in contemporary society. This development of statistics is closely interlinked with the process of
Maarseveen, J.G.S.J. van; Klep, P.M.M.; Stamhuis, I.H.
2008-01-01
In the period 1850-1940 statistics developed as a new combination of theory and practice. A wide range of phenomena were looked at in a novel way and this statistical mindset had a pervasive influence in contemporary society. This development of statistics is closely interlinked with the process of
STATISTICAL OPTIMIZATION OF PROCESS VARIABLES FOR ...
African Journals Online (AJOL)
2012-11-03
Nov 3, 2012 ... The osmotic dehydration process was optimized for water loss and solutes gain. ... basis) with safe moisture content for storage (10% wet basis) [3]. Due to ... sucrose, glucose, fructose, corn syrup and sodium chlo- ride have ...
Computing Science and Statistics. Volume 24. Graphics and Visualization
1993-03-01
Modern Statistics for Spatial Point Processes
DEFF Research Database (Denmark)
Møller, Jesper; Waagepetersen, Rasmus
2007-01-01
We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...
Modern statistics for spatial point processes
DEFF Research Database (Denmark)
Møller, Jesper; Waagepetersen, Rasmus
We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...
Robust control charts in statistical process control
Nazir, H.Z.
2014-01-01
The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust
Statistical process control in wine industry using control cards
Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja
2013-01-01
This paper is based on research into the technological process of automatic filling of wine bottles at a winery in Stip, Republic of Macedonia. Statistical process control using statistical control charts was carried out. The results and recommendations for improving the process are discussed.
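For a variables process such as fill volume, the usual control card is the Shewhart X-bar chart built from subgroup means and ranges. A hedged sketch with invented numbers (the paper's actual measurements are not given):

```python
# Illustrative Shewhart X-bar chart limits for a filling process, using the
# range method: limits = grand mean +/- A2 * R-bar. Subgroup values are invented.
def xbar_limits(grand_mean, rbar, n):
    """X-bar chart control limits for subgroups of size n (2 <= n <= 5 here)."""
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}  # standard chart constants
    return grand_mean - A2[n] * rbar, grand_mean + A2[n] * rbar

# e.g. nominal 750 mL bottles, average subgroup range 3 mL, subgroups of 5
lo, hi = xbar_limits(grand_mean=750.0, rbar=3.0, n=5)
print(round(lo, 2), round(hi, 2))
```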
Statistical Inference for Partially Observed Diffusion Processes
DEFF Research Database (Denmark)
Jensen, Anders Christian
This thesis is concerned with parameter estimation for multivariate diffusion models. It gives a short introduction to diffusion models and related mathematical concepts. We then introduce the method of prediction-based estimating functions and describe in detail its application to a two......-Uhlenbeck process, while chapter eight describes the details of an R package that was developed in relation to the application of the estimation procedure of chapters five and six.
Using Statistical Process Control to Enhance Student Progression
Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson
2012-01-01
Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…
Applying Statistical Process Control to Clinical Data: An Illustration.
Pfadt, Al; And Others
1992-01-01
Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
Jain, Lakhmi
2012-01-01
Data mining is one of the most rapidly growing research areas in computer science and statistics. In Volume 2 of this three-volume series, we have brought together contributions from some of the most prestigious researchers in theoretical data mining. Each of the chapters is self-contained. Statisticians and applied scientists/engineers will find this volume valuable. Additionally, it provides a sourcebook for graduate students interested in the current direction of research in data mining.
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the year 1998 and the year 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Precision Measurement and Calibration. Volume 1. Statistical Concepts and Procedures
1969-02-01
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Applying Statistical Process Quality Control Methodology to Educational Settings.
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the null X (mean), R (Range), X (individual observations), MR (moving…
Reaming process improvement and control: An application of statistical engineering
DEFF Research Database (Denmark)
Müller, Pavel; Genta, G.; Barbato, G.
2012-01-01
A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...
Using Statistical Process Control Methods to Classify Pilot Mental Workloads
National Research Council Canada - National Science Library
Kudo, Terence
2001-01-01
.... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...
Hadronic electroweak processes in a finite volume
International Nuclear Information System (INIS)
Agadjanov, Andria
2017-01-01
In the present thesis, we study a number of hadronic electroweak processes in a finite volume. Our work is motivated by the ongoing and future lattice simulations of the strong interaction theory called quantum chromodynamics. According to the available computational resources, the numerical calculations are necessarily performed on lattices with a finite spatial extension. The first part of the thesis is based on the finite volume formalism which is a standard method to investigate the processes with the final state interactions, and in particular, the elastic hadron resonances, on the lattice. Throughout the work, we systematically apply the non-relativistic effective field theory. The great merit of this approach is that it encodes the low-energy dynamics directly in terms of the effective range expansion parameters. After a brief introduction into the subject, we formulate a framework for the extraction of the ΔNγ * as well as the B→K * transition form factors from lattice data. Both processes are of substantial phenomenological interest, including the search for physics beyond the Standard Model. Moreover, we provide a proper field-theoretical definition of the resonance matrix elements, and advocate it in comparison to the one based on the infinitely narrow width approximation. In the second part we consider certain aspects of the doubly virtual nucleon Compton scattering. The main objective of the work is to answer the question whether there is, in the Regge language, a so-called fixed pole in the process. To answer this question, the unknown subtraction function, which enters one of the dispersion relations for the invariant amplitudes, has to be determined. The external field method provides a feasible approach to tackle this problem on the lattice. Considering the nucleon in a periodic magnetic field, we derive a simple relation for the ground state energy shift up to a second order in the field strength. The obtained result encodes the value of the
Hadronic electroweak processes in a finite volume
Energy Technology Data Exchange (ETDEWEB)
Agadjanov, Andria
2017-11-07
In the present thesis, we study a number of hadronic electroweak processes in a finite volume. Our work is motivated by the ongoing and future lattice simulations of the strong interaction theory called quantum chromodynamics. According to the available computational resources, the numerical calculations are necessarily performed on lattices with a finite spatial extension. The first part of the thesis is based on the finite volume formalism which is a standard method to investigate the processes with the final state interactions, and in particular, the elastic hadron resonances, on the lattice. Throughout the work, we systematically apply the non-relativistic effective field theory. The great merit of this approach is that it encodes the low-energy dynamics directly in terms of the effective range expansion parameters. After a brief introduction into the subject, we formulate a framework for the extraction of the ΔNγ* as well as the B→K* transition form factors from lattice data. Both processes are of substantial phenomenological interest, including the search for physics beyond the Standard Model. Moreover, we provide a proper field-theoretical definition of the resonance matrix elements, and advocate it in comparison to the one based on the infinitely narrow width approximation. In the second part we consider certain aspects of the doubly virtual nucleon Compton scattering. The main objective of the work is to answer the question whether there is, in the Regge language, a so-called fixed pole in the process. To answer this question, the unknown subtraction function, which enters one of the dispersion relations for the invariant amplitudes, has to be determined. The external field method provides a feasible approach to tackle this problem on the lattice. Considering the nucleon in a periodic magnetic field, we derive a simple relation for the ground state energy shift up to a second order in the field strength. The obtained result encodes the
Statistic techniques of process control for MTR type
International Nuclear Information System (INIS)
Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.
2002-01-01
This work aims at introducing some improvements in the fabrication of MTR type fuel plates by applying statistical techniques of process control. The work was divided into four single steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and applying statistical tools and standard specifications to perform a comparative study of these processes. (author)
HUMAN GLOMERULAR VOLUME QUANTIFICATION DURING THE AGING PROCESS
Directory of Open Access Journals (Sweden)
Dejan Zdravković
2004-12-01
Full Text Available Kidney function is directly related to changes in renal tissue, especially the glomeruli, which is particularly distinct during the aging process. The impossibility of substituting kidney function points to the need to estimate glomerular morphologic and functional characteristics during the aging process. Human cadaveric kidney tissue samples were used as the material in this research. The age of the cadavers ranged from 20 to over 70 years, and they were classified according to the scheme: I (20-29); II (30-39); III (40-49); IV (50-59); V (60-69); and VI (older than 70). After routine histologic preparation of the renal tissue, the slices were analyzed stereologically under a light microscope with a projection screen (Reichert Visopan) at 40x lens magnification. The M42 test system was used, and 100 glomeruli, selected by an unbiased method, were analyzed. The average glomerular capillary network volume shows a significant increase (p < 0.001) up to the age of 50 years with respect to the age of 20 to 29 years. This parameter shows an insignificant decrease after the age of 50 until the age of 70 years. The decrease becomes significant after the age of 70 years with respect to the 20 to 29 year period (p < 0.05) and the 40 to 49 year period (p < 0.01).
Statistical Data Processing with R – Metadata Driven Approach
Directory of Open Access Journals (Sweden)
Rudi SELJAK
2016-06-01
Full Text Available In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions based on the metadata driven approach. This means that one general program code, parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata driven principle for data validation, the generic software solution, and the main issues connected with the use of the statistical software R for this approach.
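The metadata driven principle described in this abstract can be sketched generically: validation rules live in a data structure rather than in survey-specific code, and one routine interprets them. A minimal sketch, assuming invented field names and rule keys (not taken from the Slovenian system):

```python
# Validation rules expressed as metadata: one generic routine, parametrized
# per survey. The fields and rule keys here are hypothetical illustrations.
RULES = {
    "turnover": {"type": float, "min": 0.0},
    "employees": {"type": int, "min": 0, "max": 10_000},
}

def validate(record, rules):
    """Generic validator driven entirely by the rules metadata."""
    errors = []
    for field, rule in rules.items():
        value = record.get(field)
        if not isinstance(value, rule["type"]):
            errors.append(f"{field}: expected {rule['type'].__name__}")
            continue
        if "min" in rule and value < rule["min"]:
            errors.append(f"{field}: below {rule['min']}")
        if "max" in rule and value > rule["max"]:
            errors.append(f"{field}: above {rule['max']}")
    return errors

# A clean record passes; a negative turnover is flagged by the same code.
assert validate({"turnover": 125.5, "employees": 12}, RULES) == []
assert validate({"turnover": -3.0, "employees": 12}, RULES) == ["turnover: below 0.0"]
```

Adding a survey then means adding a rules dictionary, not new program code, which is the point of the metadata driven design.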
The application of bayesian statistic in data fit processing
International Nuclear Information System (INIS)
Guan Xingyin; Li Zhenfu; Song Zhaohui
2010-01-01
The rationale and disadvantages of least squares fitting, which is commonly used in data processing, are analyzed, and the theory and common methods for applying Bayesian statistics in data processing are presented in detail. As the analysis shows, the Bayesian approach avoids the restrictive hypotheses that least squares fitting requires, and its results are more scientific and more easily understood; it may therefore replace least squares fitting in data processing. (authors)
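As a minimal illustration of the contrast drawn in this abstract (not the authors' actual computation), the sketch below fits a line both by ordinary least squares and as the posterior mean under a Gaussian prior; with a weak prior the two nearly coincide, and the prior term shows where the extra Bayesian assumptions enter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data y = 2x + 1 plus noise (illustrative, not from the paper).
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)
X = np.column_stack([np.ones_like(x), x])        # design matrix [1, x]

# Ordinary least squares: beta = (X^T X)^-1 X^T y
beta_ls = np.linalg.solve(X.T @ X, X.T @ y)

# Bayesian posterior mean with prior beta ~ N(0, tau^2 I) and known noise
# variance sigma^2: beta = (X^T X + (sigma^2/tau^2) I)^-1 X^T y
sigma2, tau2 = 0.1 ** 2, 10.0 ** 2               # assumed, weak prior
lam = sigma2 / tau2
beta_bayes = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
```

With this weak prior the two estimates are nearly identical; a tighter prior (smaller tau) pulls the Bayesian estimate toward zero, which is exactly the kind of explicit, inspectable assumption least squares leaves implicit.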
Use of statistical process control in the production of blood components
DEFF Research Database (Denmark)
Magnussen, K; Quere, S; Winkel, P
2008-01-01
Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced, when a change was made from a relatively large inexperienced occasional component manufacturing staff to an experienced regular manufacturing staff. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood... by an experienced staff with four technologists. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional...
Statistical Analysis of CMC Constituent and Processing Data
Fornuff, Jonathan
2004-01-01
Ceramic Matrix Composites (CMCs) are the next "big thing" in high-temperature structural materials. In the case of jet engines, it is widely believed that the metallic superalloys currently being utilized for hot structures (combustors, shrouds, turbine vanes and blades) are nearing their potential limits of improvement. In order to allow increased turbine temperatures for greater engine efficiency, materials scientists have begun looking toward advanced CMCs, and SiC/SiC composites in particular. Ceramic composites provide greater strength-to-weight ratios at higher temperatures than metallic alloys, but at the same time pose greater challenges in micro-structural optimization, which in turn increases the cost of the material as well as the risk of variability in the material's thermo-structural behavior. The goals of this study were to model various potential CMC engine materials, to examine the current variability in their properties due to variability in component processing conditions and constituent materials, and then to see how processing and constituent variations affect key strength, stiffness, and thermal properties of the finished components. Basically, this means trying to model variations in a component's behavior by knowing what went into creating it. Composites with an inter-phase, manufactured by chemical vapor infiltration (CVI) and melt infiltration (MI), were considered. Examinations of (1) the percent constituents by volume, (2) the inter-phase thickness, (3) variations in the total porosity, and (4) variations in the chemical composition of the SiC fiber are carried out and modeled using various codes used here at NASA-Glenn (PCGina, NASALife, CEMCAN, etc.). The effects of these variations and the ranking of their respective influences on the various thermo-mechanical material properties are studied and compared to available test data. The properties of the materials as well as minor changes to geometry are then made to the computer model and the detrimental effects...
Using Paper Helicopters to Teach Statistical Process Control
Johnson, Danny J.
2011-01-01
This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…
Memory-type control charts in statistical process control
Abbas, N.
2012-01-01
The control chart is the most important statistical tool for managing business processes. It is a graph of measurements of a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark special cause variation. Once
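Memory-type charts such as the EWMA accumulate information from past samples rather than judging each point in isolation. A minimal sketch of the standard EWMA recursion and time-varying limits, with illustrative parameters (lambda = 0.2, L = 2.7) and a noise-free demo series:

```python
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.2, L=2.7):
    """Return the EWMA statistic and indices of out-of-control samples."""
    z, ooc, prev = np.empty(len(x)), [], mu0
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev           # memory: weighted past + present
        z[i] = prev
        # exact (time-varying) control limit half-width
        half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1))))
        if abs(prev - mu0) > half:
            ooc.append(i)
    return z, ooc

# A 1-sigma mean shift at sample 20 (noise-free, so the demo is deterministic):
z, ooc = ewma_chart([0.0] * 20 + [1.0] * 20, mu0=0.0, sigma=1.0)
# The chart first signals at sample index 30, i.e. 11 samples after the shift.
```

A Shewhart chart with 3-sigma limits would never signal on this series, since no single point exceeds 3; the EWMA's memory is what makes the small sustained shift detectable.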
Manufacturing Squares: An Integrative Statistical Process Control Exercise
Coy, Steven P.
2016-01-01
In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…
Energy Technology Data Exchange (ETDEWEB)
Moore, James [U.S. Army Corps of Engineers - New York District 26 Federal Plaza, New York, New York 10278 (United States); Hays, David [U.S. Army Corps of Engineers - Kansas City District 601 E. 12th Street, Kansas City, Missouri 64106 (United States); Quinn, John; Johnson, Robert; Durham, Lisa [Argonne National Laboratory, Environmental Science Division 9700 S. Cass Ave., Argonne, Illinois 60439 (United States)
2013-07-01
As part of the ongoing remediation process at the Maywood Formerly Utilized Sites Remedial Action Program (FUSRAP) properties, Argonne National Laboratory (Argonne) assisted the U.S. Army Corps of Engineers (USACE) New York District by providing contaminated soil volume estimates for the main site area, much of which is fully or partially remediated. As part of the volume estimation process, an initial conceptual site model (ICSM) was prepared for the entire site that captured existing information (with the exception of soil sampling results) pertinent to the possible location of surface and subsurface contamination above cleanup requirements. This ICSM was based on historical anecdotal information, aerial photographs, and the logs from several hundred soil cores that identified the depth of fill material and the depth to bedrock under the site. Specialized geostatistical software developed by Argonne was used to update the ICSM with historical sampling results and down-hole gamma survey information for hundreds of soil core locations. The updating process yielded both a best guess estimate of contamination volumes and a conservative upper bound on the volume estimate that reflected the estimate's uncertainty. Comparison of model results to actual removed soil volumes was conducted on a parcel-by-parcel basis. Where sampling data density was adequate, the actual volume matched the model's average or best guess results. Where contamination was un-characterized and unknown to the model, the actual volume exceeded the model's conservative estimate. Factors affecting volume estimation were identified to assist in planning further excavations. (authors)
Thiessen, Erik D
2017-01-05
Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The differences among these tasks raise questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik
Statistical model for grain boundary and grain volume oxidation kinetics in UO2 spent fuel
International Nuclear Information System (INIS)
Stout, R.B.; Shaw, H.F.; Einziger, R.E.
1989-09-01
This paper addresses statistical characteristics for the simplest case of grain boundary/grain volume oxidation kinetics of UO2 to U3O7 for a fragment of a spent fuel pellet. It also presents a limited discussion of future extensions of this simple case to represent the more complex cases of oxidation kinetics in spent fuels. 17 refs., 1 fig
Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes
Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin
2014-01-01
The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive
Beneficiation-hydroretort processing of US oil shales: Volume 2
Energy Technology Data Exchange (ETDEWEB)
None
1989-01-01
This report has been divided into three volumes. Volume I describes the MRI beneficiation work. In addition, Volume I presents the results of joint beneficiation-hydroretorting studies and provides an economic analysis of the combined beneficiation-hydroretorting approach for processing Eastern oil shales. Volume II presents detailed results of hydroretorting tests made by HYCRUDE/IGT on raw and beneficiated oil shales prepared by MRI. Volume III comprises detailed engineering design drawings and supporting data developed by the Roberts and Schaefer Company, Engineers and Contractors, Salt Lake City, Utah, in support of the capital and operating costs for a conceptual beneficiation plant processing an Alabama oil shale.
Miller, John
1994-01-01
Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
Statistical Process Control: Going to the Limit for Quality.
Training, 1987
1987-01-01
Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
Statistical Process Control in the Practice of Program Evaluation.
Posavac, Emil J.
1995-01-01
A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)
Statistical Process Control. Impact and Opportunities for Ohio.
Brown, Harold H.
The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…
Statistical Process Control. A Summary. FEU/PICKUP Project Report.
Owen, M.; Clark, I.
A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…
Frequency-volume Statistics of Rock Falls: Examples From France, Italy and California
Dussauge-Peisser, C.; Guzzetti, F.; Wieczorek, G. F.
There is accumulating evidence that the distribution of rock-fall volumes exhibits power law (fractal) statistics in different physiographic and geologic environments. We have studied the frequency-volume statistics of rock falls in three areas: Grenoble, France; Umbria, Italy; and Yosemite Valley, California, USA. We present a comparison of the datasets currently available. For the Grenoble area, a catalogue covers rock falls that occurred between 1248 and 1995 along a 120 km long limestone cliff. The dataset contains information on 105 rock-fall events ranging in size from 3x10^-2 to 5x10^8 m^3. Only the time window 1935-1995 is considered in the study, involving 87 events from 10^-2 to 10^6 m^3. The cumulative frequency-volume statistics follow a power-law (fractal) relationship with exponent b = -0.4 over the range 50 m^3 ... For Yosemite Valley the database contains information on historical (1851-2001) rock falls (122), rock slides (251) and prehistoric rock avalanches (5). For Yosemite, the non-cumulative frequency-volume statistics of rock falls and rock slides are very similar and correlate well with a power-law (fractal) relation with exponent beta = -1.4, over the range 30 m^3 ...
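A standard way to estimate such a power-law exponent from an event catalogue is the maximum-likelihood (Hill) estimator rather than regression on binned counts. The sketch below recovers a known exponent from synthetic volumes generated by inverse-transform on a deterministic grid of quantiles; the numbers are illustrative, not the Grenoble or Yosemite data:

```python
import math

def hill_estimator(volumes, v_min):
    """ML estimate of alpha in p(v) ~ v^-alpha for volumes v >= v_min."""
    logs = [math.log(v / v_min) for v in volumes if v >= v_min]
    return 1.0 + len(logs) / sum(logs)

# Synthetic volumes from p(v) ~ v^-2.4 above v_min = 50 m^3, built by
# inverse-transform sampling on evenly spaced quantiles (deterministic demo).
alpha_true, v_min, n = 2.4, 50.0, 5000
u = [(i + 0.5) / n for i in range(n)]
volumes = [v_min * q ** (-1.0 / (alpha_true - 1.0)) for q in u]

alpha_hat = hill_estimator(volumes, v_min)   # recovers approximately 2.4
```

The ML estimator avoids the well-known bias that log-log regression on binned frequencies introduces, which matters when comparing exponents across catalogues of very different sizes.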
Processes for an Architecture of Volume
DEFF Research Database (Denmark)
Mcgee, Wes; Feringa, Jelle; Søndergaard, Asbjørn
2013-01-01
This paper addresses both the architectural, conceptual motivations and the tools and techniques necessary for the digital production of an architecture of volume. The robotic manufacturing techniques of shaping volumetric materials by hot wire and abrasive wire cutting are discussed through...
Statistical Modeling of Ultrawideband Body-Centric Wireless Channels Considering Room Volume
Directory of Open Access Journals (Sweden)
Miyuki Hirose
2012-01-01
Full Text Available This paper presents the results of a statistical modeling of on-body ultrawideband (UWB) radio channels for wireless body area network (WBAN) applications. Measurements were conducted in five different rooms. A measured delay profile can be divided into two domains: the first domain (0-4 ns), and the second domain (beyond 4 ns), which has multipath components that are dominant and dependent on room volume. The first domain was modeled with a conventional power decay law model, and the second domain with a modified Saleh-Valenzuela model considering the room volume. Realizations of the impulse responses are presented based on the composite model and compared with the measured average power delay profiles.
International Nuclear Information System (INIS)
McLoughlin, R.F.; Ryan, M.V.; Heuston, P.M.; McCoy, C.T.; Masterson, J.B.
1992-01-01
The purpose of this study was to construct and evaluate a statistical model for the quantitative analysis of computed tomographic brain images. Data were derived from standard sections in 34 normal studies. A model representing the intracranial pure tissue and partial volume areas, with allowance for beam hardening, was developed. The average percentage error in estimation of areas, derived from phantom tests using the model, was 28.47%. We conclude that our model is not sufficiently accurate to be of clinical use, even though allowance was made for partial volume and beam hardening effects. (author)
Limiting processes in non-equilibrium classical statistical mechanics
International Nuclear Information System (INIS)
Jancel, R.
1983-01-01
After recalling the basic principles of statistical mechanics, the results of ergodic theory, the passage to the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are presented. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, the BBGKY hierarchy, and limit theorems. [fr]
A new instrument for statistical process control of thermoset molding
International Nuclear Information System (INIS)
Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.
1991-01-01
The recent development of a rugged, ceramic, mold-mounted dielectric sensor and high speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points, as well as temperature and pressure, were then recorded during normal production and plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper examples of both RIM and SMC are discussed
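Plotting extracted critical points on Shewhart-type SPC charts uses the standard X-bar limits computed from subgroup ranges. A minimal sketch with the usual A2 constants from SPC tables; the subgroup values (e.g. gel-point times per batch of four moldings) are invented:

```python
import statistics

# A2 constants for X-bar charts built from subgroup ranges (standard SPC tables).
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}

def xbar_limits(subgroup_means, subgroup_ranges, n):
    """Shewhart X-bar centre line and control limits from subgroup ranges."""
    xbar = statistics.mean(subgroup_means)
    rbar = statistics.mean(subgroup_ranges)
    return xbar - A2[n] * rbar, xbar, xbar + A2[n] * rbar

# Hypothetical gel-point means and ranges for subgroups of 4 moldings:
lcl, centre, ucl = xbar_limits([10.0, 10.2, 9.8, 10.0], [1.0, 0.9, 1.1, 1.0], n=4)
# centre = 10.0, limits = 10.0 +/- 0.729 (since mean range = 1.0)
```

Any subsequent subgroup mean falling outside [lcl, ucl] would be flagged for investigation, exactly as with the molded critical points described above.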
Fracture criterion for brittle materials based on statistical cells of finite volume
International Nuclear Information System (INIS)
Cords, H.; Kleist, G.; Zimmermann, R.
1986-06-01
An analytical consideration of the Weibull statistical analysis of brittle materials established the necessity of including one additional material constant for a more comprehensive description of the failure behaviour. The Weibull analysis is restricted to infinitesimal volume elements as a consequence of the differential calculus applied. It was found that infinitesimally small elements are in conflict with the basic statistical assumption, and that the differential calculus is in fact not needed, since nowadays most stress analyses are based on finite element calculations, which are most suitable for a subsequent statistical analysis of strength. The size of a finite statistical cell has been introduced as the third material parameter. It should represent the minimum volume containing all statistical features of the material, such as the distribution of pores, flaws and grains. The new approach also contains a unique treatment of failure under multiaxial stresses. The quantity responsible for failure under multiaxial stresses is introduced as a modified strain energy. Sixteen different tensile specimens, including CT specimens, have been investigated experimentally and analyzed with the probabilistic fracture criterion. As a result it can be stated that the failure rates of all types of specimens, made from three different grades of graphite, are predictable. The accuracy of the prediction is one standard deviation. (orig.) [de]
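The weakest-link logic behind such criteria is easy to state over finite cells: each cell contributes a risk term (sigma/sigma0)^m scaled by its volume, and the cell survival probabilities multiply. A sketch of the plain two-parameter uniaxial Weibull form over finite cells (not the modified strain-energy criterion of this paper, and all parameter values invented):

```python
import math

def weibull_failure_probability(stresses, cell_volume, sigma0, m, v0):
    """Weakest-link failure probability summed over finite statistical cells
    (two-parameter Weibull, uniaxial tension; compressive cells ignored)."""
    risk = sum((s / sigma0) ** m * (cell_volume / v0) for s in stresses if s > 0)
    return 1.0 - math.exp(-risk)

# Uniform stress at the characteristic strength over 100 identical cells
# whose total volume equals the reference volume v0:
p = weibull_failure_probability([30.0] * 100, cell_volume=1.0,
                                sigma0=30.0, m=8, v0=100.0)
# risk = 100 * 1^8 * (1/100) = 1, so p = 1 - 1/e, approximately 0.632
```

In a finite element setting the stress list would come from the element solution, with one entry per statistical cell; the size-effect of Weibull theory falls out of the volume scaling term.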
USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 2
National Research Council Canada - National Science Library
Adamson, Anthony
1998-01-01
.... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...
USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 1
National Research Council Canada - National Science Library
Adamson, Anthony
1998-01-01
.... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...
Davis, B. J.; Feiveson, A. H.
1975-01-01
Results of CITARS data processing are presented in raw form. Tables of descriptive statistics are given, along with descriptions and results of inferential analyses. The inferential results are organized by the questions which CITARS was designed to answer.
Statistical convergence of a non-positive approximation process
International Nuclear Information System (INIS)
Agratini, Octavian
2011-01-01
Highlights: → A general class of approximation processes is introduced. → The A-statistical convergence is studied. → Applications in quantum calculus are delivered. - Abstract: Starting from a general sequence of linear and positive operators of discrete type, we associate its r-th order generalization. This construction involves high order derivatives of a signal and it loses the positivity property. Considering that the initial approximation process is A-statistically uniformly convergent, we prove that the property is inherited by the new sequence. Our result also includes information about the uniform convergence. Two applications in q-calculus are presented. We study q-analogues of both the Meyer-König and Zeller operators and the Stancu operators.
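For reference, the notion used here: given a non-negative regular summability matrix A = (a_jn), a sequence (x_n) is A-statistically convergent to L when the A-density of the index set where x_n stays away from L vanishes:

```latex
% A-statistical convergence of (x_n) to L:
\forall \varepsilon > 0: \quad
\lim_{j \to \infty} \; \sum_{n \,:\, |x_n - L| \ge \varepsilon} a_{jn} = 0 .
% With A = C_1, the Cesaro matrix (a_{jn} = 1/j for n \le j, 0 otherwise),
% this reduces to ordinary statistical convergence.
```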
Statistical Process Control in a Modern Production Environment
DEFF Research Database (Denmark)
Windfeldt, Gitte Bjørg
Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered here and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows building diagnostic plots based on the parameter estimates that can provide valuable insight...
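The sliding-window monitoring idea (estimate, from a model fitted to recent observations, the probability that the next item violates the specifications, and stop the process when it exceeds a threshold) can be sketched with a simple normal model. The window contents, specification limits and threshold below are invented for illustration:

```python
import math

def out_of_spec_probability(window, lsl, usl):
    """P(next item outside [lsl, usl]) under a normal fit to the window."""
    n = len(window)
    mu = sum(window) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in window) / (n - 1))
    Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF
    return 1.0 - (Phi((usl - mu) / sd) - Phi((lsl - mu) / sd))

THRESHOLD = 0.01                      # pre-determined stopping threshold
window = [-1.0, 1.0] * 10             # centred process, sd near 1
p_ok = out_of_spec_probability(window, lsl=-3.0, usl=3.0)

shifted = [1.5, 2.5] * 10             # mean drifted toward the upper limit
p_bad = out_of_spec_probability(shifted, lsl=-3.0, usl=3.0)
stop = p_bad > THRESHOLD              # True: this window triggers a stop
```

In the actual method the normal model could be replaced by any statistical model of the quality characteristic without changing the stopping rule, which is what keeps the modeling complexity invisible to the end user.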
Statistical features of pre-compound processes in nuclear reactions
International Nuclear Information System (INIS)
Hussein, M.S.; Rego, R.A.
1983-04-01
Several statistical aspects of multistep compound processes are discussed. The connection between the cross-section auto-correlation function and the average number of maxima is emphasized. The restrictions imposed by the non-zero value of the energy step used in measuring the excitation function and by the experimental error are discussed. Applications are made to the system 25Mg(3He,p)27Al. (Author) [pt]
Application of statistical process control to qualitative molecular diagnostic assays.
Directory of Open Access Journals (Sweden)
Cathal P. O'Brien
2014-11-01
Full Text Available Modern pathology laboratories, and in particular high throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require larger sample numbers, with a resultant protracted time to detection. Modelled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
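The frequency-plus-confidence-interval check described here can be sketched with a Wilson score interval around the observed mutation frequency: a run is flagged when the interval no longer covers the expected frequency. The expected frequency and counts below are invented, and the Wilson interval is one reasonable choice of interval, not necessarily the one used in the paper:

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

def in_control(observed, n, expected):
    """True while the interval around the observed frequency covers expected."""
    lo, hi = wilson_interval(observed, n)
    return lo <= expected <= hi

# Hypothetical expected mutation frequency of 40%: 15/50 positives is still
# consistent with expectation, while 8/50 signals a deviation to investigate.
```

The small-n behaviour visible here mirrors the paper's point: with 50 samples only fairly large deviations are detectable, so low-frequency mutations need many more samples before a drift can be flagged.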
Some properties of point processes in statistical optics
International Nuclear Information System (INIS)
Picinbono, B.; Bendjaballah, C.
2010-01-01
The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general purpose, and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3] making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier
Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.
2017-09-01
Digital rock physics carries the dogmatic concept of having to segment volume images for quantitative analysis but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area and diffusive tortuosity, by using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel level porosity estimate using a Gaussian mixture model. A simple model assumption was made that allowed formulating a two point correlation function for surface area estimates using Bayes' theory. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed in binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e. resolution dependence and intersample and intrasample variance. Although we cannot make any claims on the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
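The greyscale-to-porosity step described above can be illustrated with the posterior responsibility of a two-component Gaussian mixture: each voxel's grey value is mapped to the probability that it belongs to the pore component, so partial volume voxels receive fractional porosities instead of a hard 0/1 label. The component means, widths and weight below are invented, not fitted to the chalk data:

```python
import math

def gaussian(x, mu, sd):
    """Normal probability density."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def voxel_porosity(intensity, pore=(40.0, 10.0), solid=(160.0, 15.0), w_pore=0.3):
    """Posterior probability that a grey value belongs to the pore component
    of a two-component Gaussian mixture (all parameters assumed)."""
    p = w_pore * gaussian(intensity, *pore)
    s = (1.0 - w_pore) * gaussian(intensity, *solid)
    return p / (p + s)

# A voxel at the pore mean is classified as pore with near certainty, one at
# the solid mean as solid; a grey value between the modes gets a fractional
# (partial volume) porosity rather than being forced to 0 or 1.
```

Averaging these voxel-level porosities over a growing sub-volume is then a direct, segmentation-free way to probe the statistical representative elementary volume for porosity.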
77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop
2012-08-02
...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...
Competent statistical programmer: Need of business process outsourcing industry
Khan, Imran
2014-01-01
Over the last two decades Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing over a cost arbitrage. Among biometrics outsourcing, statistical programming and analysis require a very niche skill set for service delivery. The demand and supply ratios are imbalanced due to a high churn-out rate and a low supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem solving, decision making and soft skills, as the expectations from the customer are changing from task delivery to accountability for the project. This paper will highlight the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes. PMID:24987578
Design and Statistics in Quantitative Translation (Process) Research
DEFF Research Database (Denmark)
Balling, Laura Winther; Hvelplund, Kristian Tangsgaard
2015-01-01
Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars since this way of thinking...... is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale...... and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product....
Statistical representation of a spray as a point process
International Nuclear Information System (INIS)
Subramaniam, S.
2000-01-01
The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed. (c) 2000 American Institute of Physics
Ironmaking Process Alternative Screening Study, Volume 1
Energy Technology Data Exchange (ETDEWEB)
Lockwood Greene, . .
2005-01-06
Iron in the United States is largely produced from iron ore mined domestically or imported from Canada or South America. The iron ore is typically smelted in Blast Furnaces that use primarily iron ore, iron concentrate pellets, metallurgical coke, limestone and lime as the raw materials. Under current operating scenarios, the iron produced from these Blast Furnaces is relatively inexpensive compared to current alternative iron sources, e.g. direct iron reduction and imported pig iron. The primary problem with the Blast Furnace ironmaking approach is that many of these Blast Furnaces are relatively small compared to the newer, larger Blast Furnaces, and are thus relatively costly and inefficient to operate. A further problem is that supplies of high-grade metallurgical coke are increasingly short and costs are rising. In part this is due to the short supply and high cost of high-grade metallurgical coals, but it also reflects the increasingly stringent environmental controls required for coke production. After 2003, new environmental regulations for coke production will likely be promulgated, which will either increase the cost of high-quality coke production or reduce the available domestic U.S. supply. Iron production in the United States using the current, predominant Blast Furnace process will therefore become more costly and would likely be curtailed by a coke shortage. There is thus a significant need to develop or extend the economic viability of Alternate Ironmaking Processes to at least partially replace current and declining blast furnace iron sources and to provide incentives for new capacity expansion. The primary conclusions of this comparative Study of Alternative Ironmaking Process scenarios are: (1) The processes with the best combined economics (CAPEX and OPEX impacts in the I.R.R. calculation) can be grouped into those Fine Ore based processes with no scrap
On the joint statistics of stable random processes
International Nuclear Information System (INIS)
Hopcraft, K I; Jakeman, E
2011-01-01
A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)
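As a companion sketch, symmetric alpha-stable variates such as those underlying the process above can be simulated with the Chambers-Mallows-Stuck method; the stability index and sample size here are arbitrary choices, not values from the paper:

```python
import math
import random

def symmetric_stable(alpha, rng):
    """One standard symmetric alpha-stable variate (Chambers-Mallows-Stuck)."""
    v = rng.uniform(-math.pi / 2, math.pi / 2)
    w = rng.expovariate(1.0)
    if abs(alpha - 1.0) < 1e-12:
        return math.tan(v)  # alpha = 1 reduces to the Cauchy distribution
    term = math.sin(alpha * v) / math.cos(v) ** (1.0 / alpha)
    return term * (math.cos((1.0 - alpha) * v) / w) ** ((1.0 - alpha) / alpha)

rng = random.Random(42)
samples = sorted(symmetric_stable(1.5, rng) for _ in range(20001))
sample_median = samples[10000]  # symmetry about zero: median should be near 0
```

For alpha < 2 the variance (and hence the ordinary correlation function) does not exist, which is why the abstract works with fractional moments and structure functions instead.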
Statistical characterization of pitting corrosion process and life prediction
International Nuclear Information System (INIS)
Sheikh, A.K.; Younas, M.
1995-01-01
In order to prevent corrosion failures of machines and structures, it is desirable to know in advance when corrosion damage will take place, so that appropriate measures can be taken to mitigate it. Corrosion predictions are needed both at the development and at the operational stage of machines and structures. There are several forms of corrosion through which varying degrees of damage can occur; under certain conditions these corrosion processes act alone, while under other conditions several of them may occur simultaneously. Certain machine elements and structures, such as gears, bearings, tubes, pipelines, containers and storage tanks, are particularly prone to pitting corrosion, which is an insidious form of corrosion. Corrosion predictions are usually based on experimental results obtained from test coupons and/or field experience with similar machines or parts of a structure, and considerable scatter is observed in corrosion processes. The probabilistic nature and kinetics of the pitting process make it necessary to use statistical methods to forecast the residual life of machines or structures. The focus of this paper is to characterize pitting as a time-dependent random process and, using this characterization, to predict the life to reach a critical level of pitting damage. Using several data sets from the literature on pitting corrosion, the extreme value modeling of the pitting corrosion process, the evolution of the extreme value distribution in time, and their relationship to the reliability of machines and structures are explained. (author)
Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes
Williams Colin P.
1999-01-01
Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(square root of N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
Statistical process control using optimized neural networks: a case study.
Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid
2014-09-01
The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
Advanced statistics to improve the physical interpretation of atomization processes
International Nuclear Information System (INIS)
Panão, Miguel R.O.; Radu, Lucian
2013-01-01
Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are eventually associated with multiple atomization mechanisms. ► The spray is described by the drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account eventual multimodality and heterogeneity in drop size distributions. It therefore provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in heterogeneous data, which improves the physical interpretation of atomization processes. Moreover, it overcomes the limitations induced by analyzing spray droplet characteristics through moments alone, particularly the hindering of different natures of droplet formation. Finally, the method is applied to physically interpret a case study based on multijet atomization processes
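A rough illustration of decomposing a multimodal drop size distribution into a finite Gaussian mixture. For simplicity this sketch uses expectation-maximisation rather than the Bayesian MCMC fitting used in the paper, and the synthetic droplet data are invented:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_two_gaussians(data, iters=100):
    """Fit a two-component Gaussian mixture by expectation-maximisation."""
    mu1, mu2 = min(data), max(data)            # spread the initial means apart
    mean = sum(data) / len(data)
    sigma1 = sigma2 = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each droplet
        resp = []
        for x in data:
            p1 = w * normal_pdf(x, mu1, sigma1)
            p2 = (1.0 - w) * normal_pdf(x, mu2, sigma2)
            resp.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means and spreads
        n1 = sum(resp)
        n2 = len(data) - n1
        w = n1 / len(data)
        mu1 = sum(r * x for r, x in zip(resp, data)) / n1
        mu2 = sum((1.0 - r) * x for r, x in zip(resp, data)) / n2
        sigma1 = math.sqrt(sum(r * (x - mu1) ** 2 for r, x in zip(resp, data)) / n1)
        sigma2 = math.sqrt(sum((1.0 - r) * (x - mu2) ** 2 for r, x in zip(resp, data)) / n2)
    return mu1, mu2, w

# Synthetic bimodal "drop diameters": two atomization mechanisms at work.
rng = random.Random(7)
drops = [rng.gauss(20, 4) for _ in range(300)] + [rng.gauss(60, 8) for _ in range(300)]
m1, m2, w = em_two_gaussians(drops)
```

Each recovered component can then be read as a candidate droplet cluster, which a moments-only summary of the same data would conflate.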
Statistical process control charts for monitoring military injuries.
Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H
2017-12-01
An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. 
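The Shewhart u-chart with limits three SDs around a historical average, as described above, can be sketched as follows; the quarterly injury counts and exposure figures are hypothetical, not Army data:

```python
import math

def u_chart_signals(counts, sizes):
    """Shewhart u-chart: flag subgroups outside 3-sigma control limits.

    counts[i] = injuries in period i, sizes[i] = exposure (1000 person-years).
    """
    u_bar = sum(counts) / sum(sizes)       # centre line: pooled injury rate
    flags = []
    for i, (c, n) in enumerate(zip(counts, sizes)):
        half = 3.0 * math.sqrt(u_bar / n)  # 3-SD half-width for this subgroup
        if not (max(0.0, u_bar - half) <= c / n <= u_bar + half):
            flags.append(i)
    return u_bar, flags

# Hypothetical quarterly injury counts for one installation.
injuries = [340, 355, 348, 360, 352, 470, 349, 344]
exposure = [1.0] * 8                        # thousands of person-years
u_bar, signals = u_chart_signals(injuries, exposure)
```

A quarter whose rate escapes the limits (the sixth one here) is the kind of statistically significant shift the charts surface for prevention planning.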
Fuel quality processing study, volume 1
Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.
1981-01-01
A fuel quality processing study is given that provides a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and that guides the development of specifications for future synthetic fuels anticipated for use in the period 1985 to 2000. Four technical performance tasks are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, turbine fuel, LPGs and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations on impurities and emission of pollutants.
Statistical process control applied to the manufacturing of beryllia ceramics
International Nuclear Information System (INIS)
Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.
1991-01-01
To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical
National Research Council Canada - National Science Library
Adamson, Anthony
1998-01-01
.... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...
Extension of the direct statistical approach to a volume parameter model (non-integer splitting)
International Nuclear Information System (INIS)
Burn, K.W.
1990-01-01
The Direct Statistical Approach is a rigorous mathematical derivation of the second moment for surface splitting and Russian Roulette games attached to the Monte Carlo modelling of fixed source particle transport. It has been extended to a volume parameter model (involving non-integer ''expected value'' splitting), and then to a cell model. The cell model gives second moment and time functions that have a closed form. This suggests the possibility of two different methods of solution of the optimum splitting/Russian Roulette parameters. (author)
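A minimal sketch of the non-integer "expected value" splitting idea: a particle is split into floor(s) or floor(s)+1 copies so that the expected number of copies equals the splitting parameter s. The function name and parameter values are assumptions for illustration, not the paper's notation:

```python
import random

def expected_value_split(weight, s, rng):
    """Non-integer 'expected value' splitting.

    The particle becomes floor(s) or floor(s)+1 copies, chosen so that the
    expected number of copies is exactly s; each copy carries weight/s,
    which keeps the Monte Carlo game unbiased.
    """
    n = int(s)
    if rng.random() < s - n:   # promote to the next integer with prob frac(s)
        n += 1
    return [weight / s] * n

rng = random.Random(1)
trials = 100000
total_copies = sum(len(expected_value_split(1.0, 2.3, rng)) for _ in range(trials))
avg_copies = total_copies / trials   # should converge to the parameter 2.3
```

The second-moment analysis in the paper quantifies how the variance of such a game depends on where and how strongly this splitting is applied.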
Statistical dynamics of transient processes in a gas discharge plasma
International Nuclear Information System (INIS)
Smirnov, G.I.; Telegin, G.G.
1991-01-01
The properties of a gas discharge plasma to a great extent depend on random processes whose study has recently become particularly important. The present work is concerned with analyzing the statistical phenomena that occur during the prebreakdown stage in a gas discharge. Unlike other studies of breakdown in the discharge gap, in which secondary electron effects and photon processes at the electrodes must be considered, here the authors treat the case of an electrodeless rf discharge or a laser photoresonant plasma. The analysis is based on the balance between the rates of electron generation and recombination in the plasma. The fluctuation kinetics for ionization of atoms in the hot plasma may also play an important role when the electron temperature changes abruptly, as occurs during adiabatic pinching of the plasma or during electron cyclotron heating
Errors in patient specimen collection: application of statistical process control.
Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael
2008-10-01
Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
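The spreadsheet-style monitoring of mislabeled samples can be sketched as a p-chart with 3-SD limits; the monthly counts below are invented, not the study's hospital data:

```python
import math

def p_chart(events, samples):
    """Shewhart p-chart for the fraction of mislabeled tubes per period."""
    p_bar = sum(events) / sum(samples)     # centre line: pooled error rate
    flags = []
    for i, (x, n) in enumerate(zip(events, samples)):
        half = 3.0 * math.sqrt(p_bar * (1.0 - p_bar) / n)
        if not (max(0.0, p_bar - half) <= x / n <= p_bar + half):
            flags.append(i)
    return p_bar, flags

# Hypothetical monthly mislabeled-sample counts out of tubes collected.
mislabeled = [12, 9, 14, 11, 10, 31, 13, 12]
collected = [2000] * 8
p_bar, signals = p_chart(mislabeled, collected)
```

The same chart can pool WBIT counts from several small facilities that share a collection policy, as four of the participating hospitals did.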
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO 9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of the normal probability distribution, and assessment of the statistical stability and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles makes six sigma-capable processes achievable. Statistical process control is the most advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among the quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
Graphene growth process modeling: a physical-statistical approach
Wu, Jian; Huang, Qiang
2014-09-01
As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires the understanding of growth mechanisms, and methods of characterization and control of grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
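The confined exponential model for area growth can be sketched as follows; the rate constant, saturation coverage and the inversion helper are illustrative assumptions, not the paper's fitted model:

```python
import math

def coverage(t, a_max, k):
    """Confined exponential growth: island area saturates at a_max."""
    return a_max * (1.0 - math.exp(-k * t))

def rate_from_observation(t, area, a_max):
    """Invert the model to recover the rate constant from one observation."""
    return -math.log(1.0 - area / a_max) / t

k_est = rate_from_observation(5.0, coverage(5.0, 100.0, 0.3), 100.0)
growth = [coverage(t, 100.0, 0.3) for t in (0.0, 5.0, 50.0)]
```

The bounded, decelerating shape captures islands that grow quickly at first and slow as they begin to impinge on neighbours and exhaust the foil surface.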
International Nuclear Information System (INIS)
Sempau, J.; Bielajew, A.F.
2000-01-01
The Monte Carlo calculation of dose for radiotherapy treatment planning purposes introduces unavoidable statistical noise into the prediction of dose in a given volume element (voxel). When the doses in these voxels are summed to produce dose volume histograms (DVHs), this noise translates into a broadening of differential DVHs and correspondingly flatter DVHs. A brute force approach would entail calculating dose for long periods of time - enough to ensure that the DVHs had converged. In this paper we introduce an approach for deconvolving the statistical noise from DVHs, thereby obtaining estimates for converged DVHs obtained about 100 times faster than the brute force approach described above. There are two important implications of this work: (a) decisions based upon DVHs may be made much more economically using the new approach and (b) inverse treatment planning or optimization methods may employ Monte Carlo dose calculations at all stages of the iterative procedure since the prohibitive cost of Monte Carlo calculations at the intermediate calculation steps can be practically eliminated. (author)
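The dose-volume histograms discussed above are built by thresholding per-voxel doses; a minimal sketch of the cumulative DVH, with invented voxel doses:

```python
def cumulative_dvh(doses, thresholds):
    """Cumulative DVH: fraction of voxels receiving at least each dose."""
    n = len(doses)
    return [sum(1 for d in doses if d >= t) / n for t in thresholds]

# Invented per-voxel doses; Monte Carlo noise would jitter each value and
# thereby broaden the differential DVH and flatten the cumulative one.
doses = [1.0, 1.2, 0.9, 1.1, 2.0, 0.5]
dvh = cumulative_dvh(doses, [0.0, 1.0, 1.5, 2.5])
```

The deconvolution proposed in the paper aims to recover the converged version of this curve without running the simulation long enough for the per-voxel noise to die out.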
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Statistical reliability analyses of two wood plastic composite extrusion processes
International Nuclear Information System (INIS)
Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.
2011-01-01
Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC metrics of reliability for the two extrusion lines that may be helpful for use by the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving product reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
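The Kaplan-Meier survival curves used in the non-parametric comparison can be sketched as follows; the failure data below are invented, not the WPC measurements:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier survival estimate; observed[i] is False for censoring."""
    curve = []
    s = 1.0
    for t in sorted({t for t, obs in zip(times, observed) if obs}):
        at_risk = sum(1 for ti in times if ti >= t)
        deaths = sum(1 for ti, obs in zip(times, observed) if ti == t and obs)
        s *= 1.0 - deaths / at_risk
        curve.append((t, s))
    return curve

# Invented failure data (one run censored, e.g. a test stopped early).
curve = kaplan_meier([3, 5, 5, 8, 10], [True, True, True, False, True])
```

Plotting one such curve per extrusion line makes the fractile-by-fractile strength differences the abstract describes directly visible.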
Application of statistical process control to qualitative molecular diagnostic assays
LENUS (Irish Health Repository)
O'Brien, Cathal P.
2014-11-01
Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
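The "frequency estimate coupled with a confidence interval" can be sketched with a Wilson score interval; the abstract does not state which interval the authors used, so that choice, like the example counts and expected frequency, is an assumption:

```python
import math

def wilson_interval(hits, n, z=1.96):
    """95% Wilson score interval for an observed mutation frequency."""
    p = hits / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2.0 * n)) / denom
    half = z * math.sqrt(p * (1.0 - p) / n + z * z / (4.0 * n * n)) / denom
    return centre - half, centre + half

def deviates(hits, n, expected):
    """Flag a batch whose interval excludes the expected frequency."""
    lo, hi = wilson_interval(hits, n)
    return not (lo <= expected <= hi)

# Hypothetical run: 2 mutation-positive results in 20 samples, against an
# assumed historical frequency of 40% for this assay.
alarm = deviates(2, 20, 0.40)
```

The sample-size caveat in the abstract shows up directly here: with small n the interval is wide, so only large deviations from the expected frequency trigger the alarm.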
The application of statistical process control in linac quality assurance
International Nuclear Information System (INIS)
Li Dingyu; Dai Jianrong
2009-01-01
Objective: To improve a linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC is applied to set the control limits of QA data, draw charts, and differentiate between random and systematic errors. An SPC quality assurance software package named QA MANAGER has been developed in VB for clinical use. Two clinical cases are analyzed with SPC to study the daily output QA of a 6 MV photon beam. Results: In the clinical cases, SPC is able to identify the systematic errors. Conclusion: SPC may assist in detecting systematic errors in linac quality assurance; it flags abnormal trends so that systematic errors can be eliminated, improving quality control. (authors)
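The control-limit step described above can be illustrated with a simple individuals chart: limits at the baseline mean plus or minus three standard deviations, and any point outside them flagged. This is a sketch, not the QA MANAGER implementation; classical individuals charts usually estimate sigma from the moving range, and the output values below are hypothetical dose ratios.

```python
import statistics

def control_limits(baseline):
    """Individuals-chart limits: mean +/- 3 sample standard deviations.
    (The plain sample deviation is used here for brevity; moving-range
    estimates of sigma are the textbook choice.)"""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - 3 * sigma, mu + 3 * sigma

def flag_points(values, baseline):
    """Indices of QA measurements falling outside the control limits."""
    lcl, ucl = control_limits(baseline)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# hypothetical daily output measurements, normalized to the expected dose
baseline = [1.00, 1.01, 0.99, 1.00, 1.02, 0.98]
alarms = flag_points([1.00, 1.05, 0.95], baseline)
```

Points inside the limits are attributed to random error; points outside them are candidates for the systematic errors the abstract aims to identify.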
Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes
Yoon, Seyoon
2014-03-01
High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comité Euro-International du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better to predict the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of fly ash substitution rate was higher than that of w/b ratio initially but reduced over time. © 2014 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Gmel Gerrit
2010-03-01
Full Text Available Abstract Background Alcohol consumption is a major risk factor in the global burden of disease, with overall volume of exposure as the principal underlying dimension. Two main sources of data on volume of alcohol exposure are available: surveys and per capita consumption derived from routine statistics such as taxation. As both sources have significant problems, this paper presents an approach that triangulates information from both sources into disaggregated estimates in line with the overall level of per capita consumption. Methods A modeling approach was applied to the US using data from a large and representative survey, the National Epidemiologic Survey on Alcohol and Related Conditions. Different distributions (log-normal, gamma, Weibull) were used to model consumption among drinkers in subgroups defined by sex, age, and ethnicity. The gamma distribution was used to shift the fitted distributions in line with the overall volume as derived from per capita estimates. Implications for alcohol-attributable fractions were presented, using liver cirrhosis as an example. Results The triangulation of survey data with aggregated per capita consumption data proved feasible and allowed for modeling of alcohol exposure disaggregated by sex, age, and ethnicity. These models can be used in combination with risk relations for burden of disease calculations. Sensitivity analyses showed that the gamma distribution chosen yielded results very similar, in terms of fit and alcohol-attributable mortality, to those of the other tested distributions. Conclusions Modeling alcohol consumption via the gamma distribution was feasible. To further refine this approach, research should focus on the main assumptions underlying the approach to explore differences between volume estimates derived from surveys and per capita consumption figures.
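The gamma-based shifting described above can be sketched as: fit a gamma distribution to survey drinking volumes, then rescale it so its mean matches the per capita figure while the shape (and hence the relative spread) stays fixed. The true parameters and the 20 g/day per capita target below are illustrative assumptions, not values from the paper, and a method-of-moments fit stands in for whatever estimator the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gamma_moments(x):
    """Method-of-moments gamma fit: shape k and scale theta from mean and variance."""
    m, v = x.mean(), x.var()
    return m * m / v, v / m          # k, theta

def shift_to_per_capita(k, per_capita_mean):
    """Rescale the fitted gamma so its mean equals the per capita estimate,
    keeping the shape parameter (and thus the coefficient of variation) fixed."""
    return k, per_capita_mean / k

# hypothetical survey-reported consumption (grams of pure alcohol per day)
survey = rng.gamma(shape=1.5, scale=10.0, size=5000)
k, theta = fit_gamma_moments(survey)
k_shifted, theta_shifted = shift_to_per_capita(k, per_capita_mean=20.0)
```

The shifted distribution can then feed alcohol-attributable-fraction calculations in place of the raw survey distribution.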
Monitoring a PVC batch process with multivariate statistical process control charts
Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.
1999-01-01
Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Kruger, Uwe
2012-01-01
The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applica
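As the entries above note, the workhorse statistic for this kind of multivariate monitoring is Hotelling's T2, which measures how far a new observation sits from in-control reference data in the metric of the sample covariance. A minimal numpy sketch with simulated, purely hypothetical process data:

```python
import numpy as np

rng = np.random.default_rng(1)

def hotelling_t2(X_ref, x_new):
    """Hotelling T^2 distance of a new observation from in-control data."""
    mu = X_ref.mean(axis=0)
    S = np.cov(X_ref, rowvar=False)
    d = x_new - mu
    return float(d @ np.linalg.solve(S, d))

# hypothetical in-control history: 500 samples of 3 process variables
X_ref = rng.normal(size=(500, 3))
t2_ok = hotelling_t2(X_ref, np.zeros(3))                  # near the centre
t2_bad = hotelling_t2(X_ref, np.array([4.0, 4.0, 4.0]))  # shifted point
```

In high-dimensional, cross-correlated settings, the same statistic is typically computed on a few principal-component scores rather than on the raw variables.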
DEFF Research Database (Denmark)
Hansen, Jens Zangenberg; Brøndsted, Povl
2013-01-01
In a previous study, Trias et al. [1] determined the minimum size of a statistical representative volume element (SRVE) of a unidirectional fibre-reinforced composite primarily based on numerical analyses of the stress/strain field. In continuation of this, the present study determines the minimu...... size of an SRVE based on a statistical analysis on the spatial statistics of the fibre packing patterns found in genuine laminates, and those generated numerically using a microstructure generator. © 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved....
National Research Council Canada - National Science Library
Willsky, Alan
2004-01-01
.... Our research blends methods from several fields-statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...
Directory of Open Access Journals (Sweden)
Fahad A Al-Hussein
2009-01-01
Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
Effect of drop volume and surface statistics on the superhydrophobicity of randomly rough substrates
Afferrante, L.; Carbone, G.
2018-01-01
In this paper, a simple theoretical approach is developed with the aim of evaluating shape, interfacial pressure, apparent contact angle and contact area of liquid drops gently deposed on randomly rough surfaces. This method can be useful to characterize the superhydrophobic properties of rough substrates, and to investigate the contact behavior of impacting drops. We assume that (i) the size of the apparent liquid-solid contact area is much larger than the micromorphology of the substrate, and (ii) a composite interface is always formed at the microscale. Results show apparent contact angle and liquid-solid area fraction are slightly influenced by the drop volume only at relatively high values of the root mean square roughness h_rms, whereas the effect of volume is practically negligible at small h_rms. The main statistical quantity affecting the superhydrophobic properties is found to be the Wenzel roughness parameter r_W, which depends on the average slope of the surface heights. Moreover, transition from the Cassie-Baxter state to the Wenzel one is observed when r_W reduces below a certain critical value, and theoretical predictions are found to be in good agreement with experimental data. Finally, the present method can be conveniently exploited to evaluate the occurrence of pinning phenomena in the case of impacting drops, as the Wenzel critical pressure for liquid penetration gives an estimation of the maximum impact pressure tolerated by the surface without pinning occurring.
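The Wenzel parameter r_W highlighted above is the ratio of true to projected surface area, which for a height map h(x, y) can be estimated as the mean of sqrt(1 + |grad h|^2). A small sketch on a hypothetical height grid (grid size and roughness amplitude are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

def wenzel_parameter(h, dx):
    """Wenzel roughness r_W: true over projected surface area, estimated
    as the grid average of sqrt(1 + |grad h|^2)."""
    hy, hx = np.gradient(h, dx)
    return float(np.mean(np.sqrt(1.0 + hx**2 + hy**2)))

flat = np.zeros((64, 64))                      # perfectly smooth reference
rough = rng.normal(scale=0.5, size=(64, 64))   # hypothetical rough sample
```

A flat surface gives r_W = 1 exactly; increasing the average slope increases r_W, which is the quantity the abstract ties to the Cassie-Baxter/Wenzel transition.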
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki
2010-01-01
Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)
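The contrast-to-noise ratio (CNR) reported above is conventionally the difference between the vessel and background ROI means divided by the background noise; the abstract does not spell out its exact ROI recipe, so the definition and the attenuation values below are assumptions for illustration:

```python
import numpy as np

def cnr(vessel_roi, background_roi):
    """Contrast-to-noise ratio: mean ROI contrast over background noise
    (one common definition; the study's exact ROI protocol may differ)."""
    v = np.asarray(vessel_roi, dtype=float)
    b = np.asarray(background_roi, dtype=float)
    return float((v.mean() - b.mean()) / b.std())

# hypothetical attenuation values (HU) for a portal-vein ROI and a liver ROI
vessel_cnr = cnr([210.0, 210.0], [100.0, 102.0, 98.0, 100.0])
```

Because ASIR lowers the background standard deviation at a given contrast, this ratio increases, which is the effect the study quantifies.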
Discussion of "Modern statistics for spatial point processes"
DEFF Research Database (Denmark)
Jensen, Eva Bjørn Vedel; Prokesová, Michaela; Hellmund, Gunnar
2007-01-01
ABSTRACT. The paper ‘Modern statistics for spatial point processes’ by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti...
Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)
Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)
1999-01-01
This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees
Statistical 21-cm Signal Separation via Gaussian Process Regression Analysis
Mertens, F. G.; Ghosh, A.; Koopmans, L. V. E.
2018-05-01
Detecting and characterizing the Epoch of Reionization and Cosmic Dawn via the redshifted 21-cm hyperfine line of neutral hydrogen will revolutionize the study of the formation of the first stars, galaxies, black holes and intergalactic gas in the infant Universe. The wealth of information encoded in this signal is, however, buried under foregrounds that are many orders of magnitude brighter. These must be removed accurately and precisely in order to reveal the feeble 21-cm signal. This requires not only the modeling of the Galactic and extra-galactic emission, but also of the often stochastic residuals due to imperfect calibration of the data caused by ionospheric and instrumental distortions. To stochastically model these effects, we introduce a new method based on `Gaussian Process Regression' (GPR) which is able to statistically separate the 21-cm signal from most of the foregrounds and other contaminants. Using simulated LOFAR-EoR data that include strong instrumental mode-mixing, we show that this method is capable of recovering the 21-cm signal power spectrum across the entire range k = 0.07-0.3 h cMpc^-1. The GPR method performs best, having minimal and controllable impact on the 21-cm signal, when the foregrounds are correlated on frequency scales ≳ 3 MHz and the rms of the signal satisfies σ_21cm ≳ 0.1 σ_noise. This signal separation improves the 21-cm power-spectrum sensitivity by a factor ≳ 3 compared to foreground avoidance strategies and enables the sensitivity of current and future 21-cm instruments such as the Square Kilometre Array to be fully exploited.
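A toy version of the GPR separation idea: fit a Gaussian process with a long frequency-coherence-length kernel to mock data, take the posterior mean as the smooth foreground estimate, and read the fast-varying 21-cm-like signal off the residual. All amplitudes, coherence lengths and the band below are illustrative assumptions, not LOFAR-EoR values, and a single squared-exponential kernel stands in for the paper's full covariance model.

```python
import numpy as np

rng = np.random.default_rng(3)

def gpr_smooth_component(x, y, length, amp, noise):
    """Posterior mean of a GP with a squared-exponential kernel: an estimate
    of the smooth (foreground-like) component of y."""
    d = x[:, None] - x[None, :]
    K = amp**2 * np.exp(-0.5 * (d / length) ** 2)
    alpha = np.linalg.solve(K + noise**2 * np.eye(len(x)), y)
    return K @ alpha

freq = np.linspace(120.0, 160.0, 200)        # hypothetical band (MHz)
foreground = 50.0 * np.sin(freq / 20.0)      # smooth in frequency
signal = 0.5 * np.sin(freq * 4.0)            # rapidly varying, much fainter
y = foreground + signal + rng.normal(scale=0.1, size=freq.size)

smooth = gpr_smooth_component(freq, y, length=5.0, amp=50.0, noise=0.5)
residual = y - smooth                        # signal estimate plus noise
```

Because the kernel's coherence length is far longer than the signal's frequency scale, the fit absorbs the foreground but barely touches the fast component, which is the regime (foregrounds coherent over ≳ 3 MHz) the abstract identifies as favourable.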
A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)
Energy Technology Data Exchange (ETDEWEB)
Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)
2009-04-15
The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C_pc) index. The C_pc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should
A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).
Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre
2009-04-01
The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
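Of the three charts named above, the EWMA chart is the least standard to set up. A compact sketch using the asymptotic control limit L·σ·sqrt(λ/(2−λ)); the λ = 0.2, L = 3 settings and the dose-deviation values (in %) are hypothetical, not taken from the study.

```python
def ewma(values, lam=0.2, mu0=0.0):
    """Exponentially weighted moving average of QC deviations."""
    z, out = mu0, []
    for v in values:
        z = lam * v + (1 - lam) * z
        out.append(z)
    return out

def ewma_alarms(values, lam=0.2, mu0=0.0, sigma=1.0, L=3.0):
    """Indices where the EWMA leaves its asymptotic +/- L*sigma_z band."""
    limit = L * sigma * (lam / (2 - lam)) ** 0.5
    return [i for i, z in enumerate(ewma(values, lam, mu0))
            if abs(z - mu0) > limit]

# hypothetical calculated-vs-measured dose deviations (%): a sustained drift
alarms = ewma_alarms([0.1, -0.2, 0.0, 2.5, 2.5, 2.5, 2.5])
```

Because the EWMA accumulates small persistent shifts, it catches the drift a couple of points after it starts, before any single measurement need exceed a ±4%-style tolerance, which is the early-detection behaviour the abstract reports.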
An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques
2018-01-09
3. Statistical Processing. 3.1 Statistical Analysis. Statistical analysis is the mathematical science... quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques that are mainly used for alarm... Balakrishnan N, editors. Handbook of statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555-602 (Order statistics and their applications...
Radioactive waste package assay facility. Volume 3. Data processing
International Nuclear Information System (INIS)
Creamer, S.C.; Lalies, A.A.; Wise, M.O.
1992-01-01
This report, in three volumes, covers the work carried out by Taylor Woodrow Construction Ltd and two major sub-contractors, Harwell Laboratory (AEA Technology) and Siemens Plessey Controls Ltd, on the development of a radioactive waste package assay facility for cemented 500 litre intermediate level waste drums. Volume 3 describes the work carried out by Siemens Plessey Controls Ltd on the data-processing aspects of an integrated waste assay facility. It introduces the need for a mathematical model of the assay process and develops a deterministic model which could be tested using Harwell experimental data. Relevant nuclear reactions are identified. Full implementation of the model was not possible within the scope of the Harwell experimental work, although calculations suggested that the model behaved as predicted by theory. 34 figs., 52 refs., 5 tabs
Computationally efficient algorithms for statistical image processing : implementation in R
Langovoy, M.; Wittich, O.
2010-01-01
In a series of earlier papers on this subject, we proposed a novel statistical hypothesis testing method for the detection of objects in noisy images. The method uses results from percolation theory and random graph theory. We developed algorithms that allow the detection of objects of unknown shapes in
Spatio-temporal statistical models with applications to atmospheric processes
International Nuclear Information System (INIS)
Wikle, C.K.
1996-01-01
This doctoral dissertation is presented as three self-contained papers. An introductory chapter considers traditional spatio-temporal statistical methods used in the atmospheric sciences from a statistical perspective. Although this section is primarily a review, many of the statistical issues considered have not been considered in the context of these methods and several open questions are posed. The first paper attempts to determine a means of characterizing the semiannual oscillation (SAO) spatial variation in the northern hemisphere extratropical height field. It was discovered that the midlatitude SAO in 500 hPa geopotential height could be explained almost entirely as a result of spatial and temporal asymmetries in the annual variation of stationary eddies. It was concluded that the mechanism for the SAO in the northern hemisphere is a result of land-sea contrasts. The second paper examines the seasonal variability of mixed Rossby-gravity waves (MRGW) in the lower stratosphere over the equatorial Pacific. Advanced cyclostationary time series techniques were used for analysis. It was found that there are significant twice-yearly peaks in MRGW activity. Analyses also suggested a convergence of horizontal momentum flux associated with these waves. In the third paper, a new spatio-temporal statistical model is proposed that attempts to consider the influence of both temporal and spatial variability. This method is mainly concerned with prediction in space and time, and provides a spatially descriptive and temporally dynamic model
The Pearson diffusions: A class of statistically tractable diffusion processes
DEFF Research Database (Denmark)
Forman, Julie Lyng; Sørensen, Michael
The Pearson diffusions are a flexible class of diffusions defined by having linear drift and quadratic squared diffusion coefficient. It is demonstrated that explicit statistical inference is feasible for this class. Explicit optimal martingale estimating functions are found, and the corresponding...
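The defining property quoted above (linear drift, quadratic squared diffusion coefficient) can be illustrated by simulating such a diffusion with an Euler-Maruyama scheme; with the quadratic and linear diffusion terms switched off, the process reduces to an Ornstein-Uhlenbeck process with stationary mean mu and variance c. The parameter values below are arbitrary illustrations, and the discretization is a sketch rather than an exact simulation method.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_pearson(theta, mu, a, b, c, x0, dt, n):
    """Euler-Maruyama path of a Pearson diffusion:
    dX = -theta (X - mu) dt + sqrt(2 theta (a X^2 + b X + c)) dW."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        drift = -theta * (x[i - 1] - mu)
        diff2 = max(2 * theta * (a * x[i - 1] ** 2 + b * x[i - 1] + c), 0.0)
        x[i] = x[i - 1] + drift * dt + np.sqrt(diff2 * dt) * rng.normal()
    return x

# a = b = 0 gives an Ornstein-Uhlenbeck process with stationary variance c
path = simulate_pearson(theta=1.0, mu=0.0, a=0.0, b=0.0, c=0.5,
                        x0=0.0, dt=0.01, n=20000)
```

Different choices of (a, b, c) recover other members of the class (e.g. square-root and Jacobi-type diffusions), which is what makes the family flexible while keeping inference tractable.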
[AN OVERALL SOUND PROCESS] Syntactic parameters, statistic parameters, and universals
Directory of Open Access Journals (Sweden)
Nicolas Meeùs
2016-05-01
My paper intends to show that comparative musicology, in fact if not in principle, appears inherently linked to the syntactic elements of music – and so too is any encyclopedic project aiming at uncovering universals in music. Not that statistical elements cannot be universal, but they cannot be commented on as such, because they remain largely unquantifiable.
Simulation of the radiography formation process from CT patient volume
Energy Technology Data Exchange (ETDEWEB)
Bifulco, P; Cesarelli, M; Verso, E; Roccasalva Firenze, M; Sansone, M; Bracale, M [University of Naples, Federico II, Electronic Engineering Department, Bioengineering Unit, Via Claudio, 21 - 80125 Naples (Italy)
1999-12-31
The aim of this work is to develop an algorithm to simulate the radiographic image formation process using volumetric anatomical data of the patient, obtained from 3D diagnostic CT images. Many applications, including radiographic driven surgery, virtual reality in medicine and radiologist teaching and training, may take advantage of such a technique. The designed algorithm has been developed to simulate generic radiographic equipment, however oriented with respect to the patient. The simulated radiography is obtained by considering a discrete number of X-ray paths departing from the focus, passing through the patient volume and reaching the radiographic plane. To evaluate a generic pixel of the simulated radiography, the cumulative absorption along the corresponding X-ray is computed. To estimate X-ray absorption at a generic point of the patient volume, 3D interpolation of CT data has been adopted. The proposed technique is quite similar to those employed in ray tracing. A computer designed test volume has been used to assess the reliability of the radiography simulation algorithm as a measuring tool. The error analysis shows that the accuracy achieved by the radiographic simulation algorithm is largely confined within the sampling step of the CT volume. (authors) 16 refs., 12 figs., 1 tab.
Simulation of the radiography formation process from CT patient volume
International Nuclear Information System (INIS)
Bifulco, P.; Cesarelli, M.; Verso, E.; Roccasalva Firenze, M.; Sansone, M.; Bracale, M.
1998-01-01
The aim of this work is to develop an algorithm to simulate the radiographic image formation process using volumetric anatomical data of the patient, obtained from 3D diagnostic CT images. Many applications, including radiographic driven surgery, virtual reality in medicine and radiologist teaching and training, may take advantage of such a technique. The designed algorithm has been developed to simulate generic radiographic equipment, however oriented with respect to the patient. The simulated radiography is obtained by considering a discrete number of X-ray paths departing from the focus, passing through the patient volume and reaching the radiographic plane. To evaluate a generic pixel of the simulated radiography, the cumulative absorption along the corresponding X-ray is computed. To estimate X-ray absorption at a generic point of the patient volume, 3D interpolation of CT data has been adopted. The proposed technique is quite similar to those employed in ray tracing. A computer designed test volume has been used to assess the reliability of the radiography simulation algorithm as a measuring tool. The error analysis shows that the accuracy achieved by the radiographic simulation algorithm is largely confined within the sampling step of the CT volume. (authors)
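The per-pixel cumulative absorption described above reduces, in its simplest form, to a Beer-Lambert line integral of the attenuation map. The sketch below keeps only that core: a parallel-beam, axis-aligned projection over a hypothetical attenuation volume, whereas the paper's method traces divergent rays from a focus with 3D interpolation of the CT data.

```python
import numpy as np

def simulate_radiograph(mu_volume, dz):
    """Parallel-beam radiograph along the first axis: Beer-Lambert
    transmission I/I0 = exp(-integral of mu dz), with the line integral
    approximated by a discrete voxel sum."""
    line_integrals = mu_volume.sum(axis=0) * dz
    return np.exp(-line_integrals)

# hypothetical attenuation map (1/cm) with a denser block inside
vol = np.zeros((32, 16, 16))
vol[8:24, 4:12, 4:12] = 0.2
image = simulate_radiograph(vol, dz=0.1)   # transmission per detector pixel
```

Pixels behind the dense block show reduced transmission; pixels whose rays miss it record full intensity, mirroring how each simulated detector pixel accumulates absorption along its ray.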
Multivariate statistical analysis of multi-step industrial processes
DEFF Research Database (Denmark)
Reinikainen, S.P.; Høskuldsson, Agnar
2007-01-01
Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized...... efficiently, even if this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying the process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analyse...... multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...
About statistical process contribution to elastic diffraction scattering
International Nuclear Information System (INIS)
Ismanov, E.I.; Dzhuraev, Sh. Kh.; Paluanov, B.K.
1999-01-01
The experimental data on angular distributions show two basic properties. The first is the presence of backward and forward peaks. The second is a nearly isotropic angular distribution near 90 degrees, with a strong energy dependence. Different models for the partial amplitudes a_dl of diffraction statistical scattering, in particular models with Gaussian and exponential density distributions, were considered. The experimental data on pp-scattering were analyzed using the examined models
Bayesian Nonparametric Statistical Inference for Shock Models and Wear Processes.
1979-12-01
(We also note that the results in Section 2 do not depend on the support of F.) This shock model has been studied by Esary, Marshall and Proschan (1973) and by Barlow and Proschan (1975), among others. The analogy of the shock model in risk and actuarial analysis has been given by Bühlmann (1970, Chapter 2).
Statistical data processing of mobility curves of univalent weak bases
Czech Academy of Sciences Publication Activity Database
Šlampová, Andrea; Boček, Petr
2008-01-01
Roč. 29, č. 2 (2008), s. 538-541 ISSN 0173-0835 R&D Projects: GA AV ČR IAA400310609; GA ČR GA203/05/2106 Institutional research plan: CEZ:AV0Z40310501 Keywords : mobility curve * univalent weak bases * statistical evaluation Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.509, year: 2008
Statistical tests for power-law cross-correlated processes
Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene
2011-12-01
For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρDCCA(T,n), where T is the total length of the time series and n the window size. For ρDCCA(T,n), we numerically calculated the Cauchy inequality -1≤ρDCCA(T,n)≤1. Here we derive -1≤ρDCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρDCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρDCCA(T,n) tends with increasing T to 1/T. Using ρDCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
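The detrended cross-correlation coefficient ρDCCA(T,n) discussed above can be sketched in a few lines. This is a minimal non-overlapping-window version with linear detrending of the integrated profiles, an illustrative reading of the definition rather than the authors' code.

```python
def _detrended_resid(w):
    # least-squares linear detrend of one window of the profile; returns residuals
    m = len(w)
    xs = range(m)
    xbar = (m - 1) / 2.0
    ybar = sum(w) / m
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, w))
    b = sxy / sxx
    a = ybar - b * xbar
    return [y - (a + b * x) for x, y in zip(xs, w)]

def rho_dcca(x, y, n):
    # rho = F2_xy / sqrt(F2_xx * F2_yy) over non-overlapping windows of size n
    T = len(x)
    mx, my = sum(x) / T, sum(y) / T
    X, Y, cx, cy = [], [], 0.0, 0.0
    for xi, yi in zip(x, y):          # integrated (profile) series of demeaned signals
        cx += xi - mx
        cy += yi - my
        X.append(cx)
        Y.append(cy)
    fxy = fxx = fyy = 0.0
    for s in range(0, T - n + 1, n):
        rx = _detrended_resid(X[s:s + n])
        ry = _detrended_resid(Y[s:s + n])
        fxy += sum(a * b for a, b in zip(rx, ry))
        fxx += sum(a * a for a in rx)
        fyy += sum(b * b for b in ry)
    return fxy / (fxx * fyy) ** 0.5
```

A series against itself gives ρ = 1 and against its negation gives ρ = -1, consistent with the Cauchy bounds stated in the abstract.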
Alternating event processes during lifetimes: population dynamics and statistical inference.
Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng
2018-01-01
In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the features of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and within-process structure, the paper provides a new and general way to study alternating event processes.
Statistical process control support during Defense Waste Processing Facility chemical runs
International Nuclear Information System (INIS)
Brown, K.G.
1994-01-01
The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water-soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and fly ash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous or PHA) is blended with the insoluble sludge and ground glass frit to produce melter feed slurry, which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository
Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.
Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida
2017-03-30
Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T2 statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing the process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy, and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, it was found that the imposed disturbances were only detected with OPLS.
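The Hotelling's T2 statistic at the core of both the PLS- and OPLS-based control charts above can be sketched directly. This is a generic illustration (plain sample covariance on raw variables, small dimension), not the paper's latent-variable implementation; the function names are hypothetical.

```python
def _inv(M):
    # Gauss-Jordan matrix inverse; adequate for the small covariance matrices used here
    p = len(M)
    A = [list(row) + [1.0 if i == j else 0.0 for j in range(p)]
         for i, row in enumerate(M)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))  # partial pivoting
        A[c], A[piv] = A[piv], A[c]
        d = A[c][c]
        A[c] = [v / d for v in A[c]]
        for r in range(p):
            if r != c:
                f = A[r][c]
                A[r] = [v - f * w for v, w in zip(A[r], A[c])]
    return [row[p:] for row in A]

def hotelling_t2(sample, data):
    # T2 = (x - xbar)' S^{-1} (x - xbar) for one new observation vs. training data
    n, p = len(data), len(data[0])
    mean = [sum(col) / n for col in zip(*data)]
    S = [[sum((row[i] - mean[i]) * (row[j] - mean[j]) for row in data) / (n - 1)
          for j in range(p)] for i in range(p)]
    Sinv = _inv(S)
    d = [sample[i] - mean[i] for i in range(p)]
    return sum(d[i] * Sinv[i][j] * d[j] for i in range(p) for j in range(p))
```

In a latent-variable chart the same quadratic form is computed on PCA, PLS, or OPLS scores rather than on the raw variables, with control limits from an F-distribution.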
Multivariate Statistical Process Optimization in the Industrial Production of Enzymes
DEFF Research Database (Denmark)
Klimkiewicz, Anna
of product yield. The potential of NIR technology to monitor the activity of the enzyme has been the subject of a feasibility study presented in PAPER I. It included (a) evaluation of which of the two real-time NIR flow cell configurations is the preferred arrangement for monitoring of the retentate stream downstream...... Strategies for the organization of these datasets, with varying numbers of timestamps, into data structures fit for latent variable (LV) modeling have been compared. The ultimate aim of the data mining steps is the construction of statistical 'soft models' which capture the principal or latent behavior......
Signal processing and statistical analysis of space-based measurements
International Nuclear Information System (INIS)
Iranpour, K.
1996-05-01
The report deals with data obtained by the ROSE rocket project. This project was designed to investigate low-altitude auroral instabilities in the electrojet region. The spectral and statistical analyses indicate the existence of unstable waves in the ionized gas in the region. An experimentally obtained dispersion relation for these waves was established. It was demonstrated that the characteristic phase velocities are much lower than expected from standard theoretical results. The analysis of the ROSE data indicates a cascading of energy from lower to higher frequencies. 44 refs., 54 figs
Statistical and signal-processing concepts in surface metrology
International Nuclear Information System (INIS)
Church, E.L.; Takacs, P.Z.
1986-03-01
This paper proposes the use of a simple two-scale model of surface roughness for testing and specifying the topographic figure and finish of synchrotron-radiation mirrors. In this approach the effects of figure and finish are described in terms of their slope distribution and power spectrum, respectively, which are then combined with the system point spread function to produce a composite image. The result can be used to predict mirror performance or to translate design requirements into manufacturing specifications. Pacing problems in this approach are the development of a practical long-trace slope-profiling instrument and realistic statistical models for figure and finish errors
Ready-to-Use Simulation: Demystifying Statistical Process Control
Sumukadas, Narendar; Fairfield-Sonn, James W.; Morgan, Sandra
2005-01-01
Business students are typically introduced to the concept of process management in their introductory course on operations management. A very important learning outcome here is an appreciation that the management of processes is a key to the management of quality. Some of the related concepts are qualitative, such as strategic and behavioral…
An easy and low cost option for economic statistical process control ...
African Journals Online (AJOL)
An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.
Process innovations to minimize waste volumes at Savannah River
International Nuclear Information System (INIS)
Doherty, J.P.
1986-01-01
In 1983 approximately 1.6 × 10³ m³ (427,000 gallons) of radioactive salt solution were decontaminated in a full-scale demonstration. The cesium decontamination factor (DF) was in excess of 4 × 10⁴, versus a goal of 1 × 10⁴. Data from this test were combined with pilot data and used to design the permanent facilities currently under construction. Startup of the Salt Decontamination Process is scheduled for 1987; it will decontaminate 2 × 10⁴ m³ (5.2 million gallons) of radioactive salt solution and generate 2 × 10³ m³ (520,000 gallons) of concentrated and washed precipitate per year. The Defense Waste Processing Facility (DWPF) will begin processing this concentrate in the Precipitate Hydrolysis Process starting in 1989. Laboratory data using simulated salt solution and nonradioactive cesium are being used to design this process. A 1/5-scale pilot plant is under construction and will be used to gain large-scale operating experience using nonradioactive simulants. This pilot plant is scheduled to start up in early 1987. The incentives to reduce the volume of waste that must be treated are self-evident. At Savannah River, process development innovations to minimize the DWPF feed volumes have directly improved the economics of the process. The integrity of the final borosilicate glass waste form has not been compromised by these developments. Many of the unit operations are familiar to chemical engineers and were put to use in a unique environment. As a result, tax dollars have been saved, and the objective of safely disposing of the nation's high-level defense waste has moved forward
The Use of Statistical Methods in Dimensional Process Control
National Research Council Canada - National Science Library
Krajcsik, Stephen
1985-01-01
... erection. To achieve this high degree of unit accuracy, we have begun a pilot dimensional control program that has set the guidelines for systematically monitoring each stage of the production process prior to erection...
Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis
Konrad, T. G.; Kropfli, R. A.
1975-01-01
Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.
Automated force volume image processing for biological samples.
Directory of Open Access Journals (Sweden)
Pavel Polyakov
2011-04-01
Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for, and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
Processing and statistical analysis of soil-root images
Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov
2016-04-01
The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). Following further development of soil zymography - to obtain a higher resolution of enzyme activities - we aimed to 1) quantify the images, and 2) determine whether the pattern (e.g. distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to determine whether the pattern is clumped (aggregated) or regular (dispersed). Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), compared to the control box without a plant, which showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such different patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
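One simple way to make the clumped-versus-dispersed classification above concrete is the Clark-Evans nearest-neighbour ratio. The abstract does not say which point-pattern statistic the authors used, so this is an illustrative sketch only (edge corrections omitted).

```python
import math

def clark_evans(points, area):
    # R = observed mean nearest-neighbour distance / expected distance under CSR;
    # R < 1 suggests clumping, R ~ 1 randomness, R > 1 regularity (no edge correction)
    n = len(points)
    observed = sum(
        min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        for i, p in enumerate(points)
    ) / n
    expected = 0.5 / math.sqrt(n / area)  # 1/(2*sqrt(density))
    return observed / expected
```

A regular grid of hotspot coordinates yields R well above 1, while points piled into one corner of the image yield R well below 1.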
Counting statistics of non-markovian quantum stochastic processes
DEFF Research Database (Denmark)
Flindt, Christian; Novotny, T.; Braggio, A.
2008-01-01
We derive a general expression for the cumulant generating function (CGF) of non-Markovian quantum stochastic transport processes. The long-time limit of the CGF is determined by a single dominating pole of the resolvent of the memory kernel from which we extract the zero-frequency cumulants...
Statistical optimization of process parameters for the production of ...
African Journals Online (AJOL)
In this study, optimization of process parameters such as moisture content, incubation temperature and initial pH (fixed) for the improvement of citric acid production from oil palm empty fruit bunches through solid state bioconversion was carried out using traditional one-factor-at-a-time (OFAT) method and response surface ...
International Nuclear Information System (INIS)
Ono, A.; Horiuchi, H.
1996-01-01
Statistical properties of antisymmetrized molecular dynamics (AMD) are classical in the case of nucleon-emission processes, while they are quantum mechanical for processes without nucleon emission. In order to understand this situation, we first clarify that two mutually opposite statistics coexist in the AMD framework: one is the classical statistics of the motion of wave packet centroids and the other is the quantum statistics of the motion of wave packets, which is described by the AMD wave function. We prove the classical statistics of wave packet centroids by using the framework of the microcanonical ensemble of the nuclear system with a realistic effective two-nucleon interaction. We show that the relation between the classical statistics of wave packet centroids and the quantum statistics of wave packets can be obtained by taking into account the effects of the wave packet spread. This relation clarifies how the quantum statistics of wave packets emerges from the classical statistics of wave packet centroids. It is emphasized that the temperature of the classical statistics of wave packet centroids is different from the temperature of the quantum statistics of wave packets. We then explain that the statistical properties of AMD for nucleon-emission processes are classical because nucleon-emission processes in AMD are described by the motion of wave packet centroids. We further show that when we improve the description of the nucleon-emission process so as to take into account the momentum fluctuation due to the wave packet spread, the AMD statistical properties for nucleon-emission processes change drastically into quantum statistics. Our study of nucleon-emission processes can conversely be regarded as giving another kind of proof of the fact that the statistics of wave packets is quantum mechanical while that of wave packet centroids is classical. copyright 1996 The American Physical Society
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
A statistical approach to define some tofu processing conditions
Directory of Open Access Journals (Sweden)
Vera de Toledo Benassi
2011-12-01
The aim of this work was to make tofu from soybean cultivar BRS 267 under different processing conditions in order to evaluate the influence of each treatment on product quality. A fractional factorial 2^(5-1) design was used, in which the independent variables (thermal treatment, coagulant concentration, coagulation time, curd cutting, and draining time) were tested at two different levels. The response variables studied were hardness, yield, total solids, and protein content of the tofu. Polynomial models were generated for each response. To obtain tofu with desirable characteristics (hardness ~4 N, yield of 306 g tofu per 100 g soybeans, 12 g protein per 100 g tofu, and 22 g solids per 100 g tofu), the following processing conditions were selected: heating until boiling plus 10 minutes in a water bath, 2% dihydrated CaSO4 (w/w), 10 minutes coagulation, curd cutting, and 30 minutes draining time.
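The 2^(5-1) half-fraction mentioned above can be generated from four base factors plus one generator column. A common choice of defining relation is I = ABCDE; the paper's actual generator and alias structure are not stated, so this sketch is an assumption.

```python
from itertools import product

def fractional_factorial_2_5_1():
    # half fraction of a 2^5 full factorial: 16 runs instead of 32;
    # the fifth factor is generated as E = A*B*C*D with levels coded -1/+1
    return [(a, b, c, d, a * b * c * d)
            for a, b, c, d in product((-1, 1), repeat=4)]
```

Each run is a level setting for the five process variables (thermal treatment, coagulant concentration, coagulation time, curd cutting, draining time); main effects are then aliased only with high-order interactions.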
Use of statistical process control in evaluation of academic performance
Directory of Open Access Journals (Sweden)
Ezequiel Gibbon Gautério
2014-05-01
The aim of this article was to study indicators of academic performance (number of students per class, dropout rate, failure rate, and scores obtained by the students) to identify a pattern of behavior that would enable improvements in the teaching-learning process. The sample was composed of five classes of undergraduate courses in Engineering. The data were collected over three years. Initially, an exploratory analysis with analytical and graphical techniques was performed. An analysis of variance and Tukey's test investigated some sources of variability. This information was used in the construction of control charts. We have found evidence that classes with more students are associated with higher failure rates and lower mean scores. Moreover, when the course was later in the curriculum, the students had higher scores. The results showed that, although some special causes interfering with the process were detected, it was possible to stabilize and monitor it.
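The control-chart construction described above can be sketched with an individuals chart whose 3-sigma limits are estimated from the average moving range. The article does not specify the chart type, so the chart choice and the constant d2 = 1.128 (for moving ranges of size 2) are assumptions.

```python
def control_limits(samples):
    # individuals (X) chart: center line and 3-sigma limits from the moving range
    n = len(samples)
    mean = sum(samples) / n
    mr = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mrbar = sum(mr) / len(mr)
    sigma = mrbar / 1.128          # d2 constant for subgroups of size 2
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples):
    # indices of points beyond the limits: candidate "special causes"
    lcl, _, ucl = control_limits(samples)
    return [i for i, v in enumerate(samples) if v < lcl or v > ucl]
```

Points flagged here would correspond to the special causes the authors investigated before declaring the teaching-learning process stable.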
Statistical and dynamical aspects in fission process: The rotational ...
Indian Academy of Sciences (India)
the fission process, during the evolution from compound nucleus to the ...... For fission induced by light particles like n, p, and α, the total angular momenta ...... [figure residue removed: panels compared 16O+232Th at 96 MeV and 10B+232Th at 72 MeV against the saddle-point transition-state model (TSM)] Systematic investigations in both light- and heavy-ion-induced fissions have shown that ......
International Nuclear Information System (INIS)
1980-08-01
This Volume II presents engineering feasibility evaluations of the eleven processes for solidification of nuclear high-level liquid wastes (HLLW) described in Volume I of this report. Each evaluation was based on a systematic assessment of the process with respect to six principal evaluation criteria: complexity of process; state of development; safety; process requirements; development work required; and facility requirements. The principal criteria were further subdivided into a total of 22 subcriteria, each of which was assigned a weight. Each process was then assigned a figure of merit, on a scale of 1 to 10, for each of the subcriteria. A total rating was obtained for each process by summing the products of the subcriteria ratings and the subcriteria weights. The evaluations were based on the process descriptions presented in Volume I of this report, supplemented by information obtained from the literature, including publications by the originators of the various processes. Waste form properties were, in general, not evaluated. This document describes the approach that was taken, the development and application of the rating criteria and subcriteria, and the evaluation results. A series of appendices sets forth summary descriptions of the processes and the ratings, together with the complete numerical ratings assigned; two appendices present further technical details on the rating process
Method of volume-reducing processing for radioactive wastes
International Nuclear Information System (INIS)
Sato, Koei; Yamauchi, Noriyuki; Hirayama, Toshihiko.
1985-01-01
Purpose: To process the treatment products of radioactive liquid wastes and burnable solid wastes produced from nuclear facilities into stable solidified products by heat melting. Method: First, glass fiber wastes from contaminated air filters are charged into a melting furnace. Then, waste products obtained through drying, sintering, incineration, etc. are mixed with a proper amount of glass fibers and charged into the melting furnace. Both charged components are heated to the temperature at which the glass fibers melt. The burnable materials are burnt out, giving highly volume-reduced products. When the products are further heated to the melting temperature of the metals or metal oxides, which is higher than that of the glass fibers, the glass fibers and the metals or metal oxides fuse together into a combined molecular structure, yielding more stable products. The products are excellent in strength, stability, durability and leaching resistance at ambient temperature. (Kamimura, M.)
Intertime jump statistics of state-dependent Poisson processes.
Daly, Edoardo; Porporato, Amilcare
2007-01-01
A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. Such a method uses the survivor function obtained by a modified version of the master equation associated with the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. The application of the method to a model for neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.
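For a state-dependent Poisson rate, interarrival times can also be sampled numerically by thinning against a constant bound — a standard simulation-side complement to the survivor-function route the abstract describes. The rate function and bound here are hypothetical illustrations, not the paper's models.

```python
import random

def sample_interarrival(rate, state, lam_max, rng):
    # Lewis-Shedler thinning: propose gaps at the bounding rate lam_max and
    # accept each proposal with probability rate(state)/lam_max
    t = 0.0
    while True:
        t += rng.expovariate(lam_max)
        if rng.random() < rate(state) / lam_max:
            return t
```

With a frozen state the sampler reduces to an exponential with rate rate(state), which makes its mean easy to check; in a full simulation the state (and hence the rate) would be updated after each accepted jump.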
Nonlinear Statistical Signal Processing: A Particle Filtering Approach
International Nuclear Information System (INIS)
Candy, J.
2007-01-01
An introduction to particle filtering is given, starting with an overview of Bayesian inference from batch to sequential processors. Once the evolving Bayesian paradigm is established, simulation-based methods using sampling theory and Monte Carlo realizations are discussed. Here the usual limitations of nonlinear approximations and non-Gaussian processes prevalent in classical nonlinear processing algorithms (e.g. Kalman filters) are no longer a restriction to performing Bayesian inference. It is shown how the underlying hidden or state variables are easily assimilated into this Bayesian construct. Importance sampling methods are then discussed, and it is shown how they can be extended to sequential solutions implemented using Markovian state-space models as a natural evolution. With this in mind, the idea of a particle filter, which is a discrete representation of a probability distribution, is developed, and it is shown how it can be implemented using sequential importance sampling/resampling methods. Finally, an application is briefly discussed comparing the performance of particle filter designs with classical nonlinear filter implementations
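A bootstrap particle filter of the kind surveyed above can be sketched for a toy linear-Gaussian state-space model. The model, parameters, and particle count are illustrative assumptions, not anything from the report.

```python
import random
import math

def particle_filter(ys, n_particles=500, a=0.9, q=1.0, r=1.0, seed=0):
    # bootstrap (SIR) filter for x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r)
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in ys:
        # propagate each particle through the state model
        parts = [a * p + rng.gauss(0.0, math.sqrt(q)) for p in parts]
        # weight by the (unnormalized) Gaussian measurement likelihood
        w = [math.exp(-(y - p) ** 2 / (2.0 * r)) for p in parts]
        tot = sum(w) or 1.0
        w = [wi / tot for wi in w]
        # posterior-mean estimate, then multinomial resampling
        estimates.append(sum(wi * p for wi, p in zip(w, parts)))
        parts = rng.choices(parts, weights=w, k=n_particles)
    return estimates
```

The resampling step is what keeps the discrete representation of the posterior from degenerating onto a few heavily weighted particles; for this linear-Gaussian toy model the result can be cross-checked against a Kalman filter.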
Statistical problems raised by data processing of food surveys
International Nuclear Information System (INIS)
Lacourly, Nancy
1974-01-01
The methods used for the analysis of dietary habits of national populations - food surveys - have been studied. S. Lederman's linear model for estimating average individual consumption from total family diets was examined in the light of a food survey carried out with 250 Roman families in 1969. An important bias in the estimates thus obtained was revealed by a simulation assuming a 'housewife's dictatorship'; these assumptions should help in setting up an unbiased model. Several techniques of multidimensional analysis were therefore used, and the theoretical aspects of linear regression had to be investigated for some particular situations: quasi-collinear 'independent' variables, measurements with errors, and positive constraints on regression coefficients. A new survey methodology was developed that takes account of the new 'Integrated Information Systems', which affect all stages of a consumption survey: organization, data collection, constitution of an information bank, and data processing. (author) [fr]
Energy Technology Data Exchange (ETDEWEB)
Beane, S R; Detmold, W; Lin, H W; Luu, T C; Orginos, K; Parreno, A; Savage, M J; Torok, A; Walker-Loud, A
2011-07-01
The volume dependence of the octet baryon masses and relations among them are explored with Lattice QCD. Calculations are performed with an nf = 2 + 1 clover fermion discretization in four lattice volumes, with spatial extent L ≈ 2.0, 2.5, 3.0 and 4.0 fm, with an anisotropic lattice spacing of b_s ≈ 0.123 fm in the spatial direction and b_t = b_s/3.5 in the time direction, and at a pion mass of m_π ≈ 390 MeV. The precision of the ground-state baryon mass determinations permits a study of the volume dependence of the masses, of the Gell-Mann-Okubo mass relation, and of other mass combinations. A comparison with the predictions of heavy baryon chiral perturbation theory is performed in both the SU(2)_L ⊗ SU(2)_R and SU(3)_L ⊗ SU(3)_R expansions. Predictions of the three-flavor expansion for the hadron masses are found to describe the observed volume dependences reasonably well. Further, the ΔNπ axial coupling constant is extracted from the volume dependence of the nucleon mass in the two-flavor expansion, with only small modifications in the three-flavor expansion from the inclusion of kaons and etas. At a given value of m_π L, the finite-volume contributions to the nucleon mass are predicted to be significantly smaller at m_π ≈ 140 MeV than at m_π ≈ 390 MeV due to a coefficient that scales as m_π^3. This is relevant for the design of future ensembles of lattice gauge-field configurations. Finally, the volume dependence of the pion and kaon masses is analyzed with two-flavor and three-flavor chiral perturbation theory.
Hall, Michelle G; Mattingley, Jason B; Dux, Paul E
2015-08-01
The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved.
Multiplicative Process in Turbulent Velocity Statistics: A Simplified Analysis
Chillà, F.; Peinke, J.; Castaing, B.
1996-04-01
Many models in turbulence link the energy cascade process and intermittency, whose characteristic signature is the shape evolution of the probability density functions (pdf) of longitudinal velocity increments. Using recent models and experimental results, we show that the flatness factor of these pdfs gives a simple and direct estimate of what is called the deepness of the cascade. We analyse in this way the published data of a Direct Numerical Simulation and show that the deepness of the cascade presents the same Reynolds number dependence as in laboratory experiments.
E.W. Fobes; R.W. Rowe
1968-01-01
A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.
Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory
2016-05-12
Final Report: Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory. Reporting period: 15-May-2014 to 14-Feb-2015. Sponsor: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Distribution unlimited. Keywords: mathematical statistics; time series; Markov chains; random processes. The report covers three areas.
Initial uncertainty impacts statistical learning in sound sequence processing.
Todd, Juanita; Provost, Alexander; Whitson, Lisa; Mullens, Daniel
2016-11-01
This paper features two studies confirming a lasting impact of first learning on how subsequent experience is weighted in early relevance-filtering processes. In both studies participants were exposed to sequences of sound that contained a regular pattern on two different timescales. Regular patterning in sound is readily detected by the auditory system and used to form "prediction models" that define the most likely properties of sound to be encountered in a given context. The presence and strength of these prediction models is inferred from changes in automatically elicited components of auditory evoked potentials. Both studies employed sound sequences that contained both a local and a longer-term pattern. The local pattern was defined by a regularly repeating pure tone occasionally interrupted by a rare deviating tone (p=0.125) that was physically different (a 30 ms vs. 60 ms duration difference in one condition and a 1000 Hz vs. 1500 Hz frequency difference in the other). The longer-term pattern was defined by the rate at which the two tones alternated probabilities (i.e., the tone that was first rare became common and the tone that was first common became rare). There was no task related to the tones and participants were asked to ignore them while focusing attention on a movie with subtitles. Auditory-evoked potentials revealed long-lasting modulatory influences based on whether the tone was initially encountered as rare and unpredictable or common and predictable. The results are interpreted as evidence that probability (or indeed predictability) assigns a differential information-value to the two tones that in turn affects the extent to which prediction models are updated and imposed. These effects are exposed for both common and rare occurrences of the tones. The studies contribute to a body of work that reveals that probabilistic information is not faithfully represented in these early evoked potentials and instead exposes that predictability (or conversely
Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E
2018-05-01
The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with the three-degree-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI
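The two ingredients of this framework can be sketched in simplified form (assumptions: the NCC here operates on 1-D intensity lists as a stand-in for the study's 3-D voxel data within the RTVs, and the baseline values are invented):

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length intensity samples
    (1-D stand-in for the voxel intensities inside a tracking volume)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)

def lower_control_limit(baseline, k=3.0):
    """Patient-specific lower limit from the first few fractions:
    mean(baseline) - k * sample standard deviation."""
    n = len(baseline)
    mean = sum(baseline) / n
    var = sum((x - mean) ** 2 for x in baseline) / (n - 1)
    return mean - k * math.sqrt(var)

# NCC values of the first five fractions (invented numbers)
baseline = [0.95, 0.96, 0.94, 0.95, 0.96]
lcl = lower_control_limit(baseline)
```

A later fraction whose NCC falls below the patient-specific lower limit would be flagged for review, mirroring the control-chart logic described above.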
Weibull statistics effective area and volume in the ball-on-ring testing method
DEFF Research Database (Denmark)
Frandsen, Henrik Lund
2014-01-01
The ball-on-ring method is, together with other biaxial bending methods, often used for measuring the strength of plates of brittle materials, because machining defects are remote from the high stresses causing the failure of the specimens. In order to scale the measured Weibull strength...... to geometries relevant for the application of the material, the effective area or volume of the test specimen must be evaluated. In this work, analytical expressions for the effective area and volume of the ball-on-ring test specimen are derived. In the derivation the multiaxial stress field has been accounted...
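Once an effective volume is known, weakest-link Weibull statistics scale a measured characteristic strength between geometries. A minimal sketch of that scaling step (the paper's closed-form expressions for the ball-on-ring effective area and volume are not reproduced here; the numbers below are invented):

```python
def scale_weibull_strength(sigma1, v_eff1, v_eff2, m):
    """Weakest-link (Weibull) size scaling of a characteristic strength from
    effective volume v_eff1 to effective volume v_eff2, with Weibull
    modulus m:  sigma2 = sigma1 * (v_eff1 / v_eff2) ** (1 / m)."""
    return sigma1 * (v_eff1 / v_eff2) ** (1.0 / m)

# A strength of 300 MPa measured at V_eff = 1 mm^3, scaled to 8 mm^3, m = 10
sigma2 = scale_weibull_strength(300.0, 1.0, 8.0, 10.0)
```

The larger component is predicted to be weaker, which is why the effective volume of the test specimen must be evaluated before transferring test data to the application geometry.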
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
Hu, Anqi; Li, Xiaolin; Ajdari, Amin; Jiang, Bing; Burkhart, Craig; Chen, Wei; Brinson, L. Catherine
2018-05-01
The concept of representative volume element (RVE) is widely used to determine the effective material properties of random heterogeneous materials. In the present work, the RVE is investigated for the viscoelastic response of particle-reinforced polymer nanocomposites in the frequency domain. The smallest RVE size and the minimum number of realizations at a given volume size for both structural and mechanical properties are determined for a given precision using the concept of margin of error. It is concluded that using the mean of many realizations of a small RVE instead of a single large RVE can retain the desired precision of a result with much lower computational cost (up to three orders of magnitude reduced computation time) for the property of interest. Both the smallest RVE size and the minimum number of realizations for a microstructure with higher volume fraction (VF) are larger compared to those of one with lower VF at the same desired precision. Similarly, a clustered structure is shown to require a larger minimum RVE size as well as a larger number of realizations at a given volume size compared to the well-dispersed microstructures.
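The margin-of-error criterion for choosing the number of realizations can be sketched as follows (an assumed normal-approximation form with a pilot standard deviation, not the authors' exact procedure; the numbers are invented):

```python
import math

def min_realizations(sample_std, margin, z=1.96):
    """Smallest number of RVE realizations so that the approximate 95%
    confidence half-width z * s / sqrt(n) on the mean property does not
    exceed the requested margin of error."""
    return math.ceil((z * sample_std / margin) ** 2)

# Pilot std of 0.05 (normalized modulus), requested margin of error 0.01
n_needed = min_realizations(0.05, 0.01)
```

Halving the precision requirement cuts the required number of realizations by roughly a factor of four, which is why averaging many small RVEs can be so much cheaper than one large RVE of equivalent precision.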
Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods
Davis, A. D.
2015-12-01
The marine ice sheet instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI) - here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity
Directory of Open Access Journals (Sweden)
Guez F.
2006-11-01
the distribution of block volumes. But it is precisely this distribution that governs the choice of one or several successive recovery methods. Therefore, this article describes an original method for statistically computing the distribution law of matrix-block volumes. The method can be applied at any point in a reservoir. The reservoir portion involved with blocks having a given volume is deduced from this method. A general understanding of the fracturing phenomenon serves as the basis for the model. Subsurface observations on reservoir fracturing provide the data (histograms of fracture directions and spacings). An application to the Eschau field (Alsace, France) is described here to illustrate the method.
International Nuclear Information System (INIS)
Montgomery, David W. G.; Amira, Abbes; Zaidi, Habib
2007-01-01
The widespread application of positron emission tomography (PET) in clinical oncology has driven this imaging technology into a number of new research and clinical arenas. Increasing numbers of patient scans have led to an urgent need for efficient data handling and the development of new image analysis techniques to aid clinicians in the diagnosis of disease and planning of treatment. Automatic quantitative assessment of metabolic PET data is attractive and will certainly revolutionize the practice of functional imaging since it can lower variability across institutions and may enhance the consistency of image interpretation independent of reader experience. In this paper, a novel automated system for the segmentation of oncological PET data aiming at providing an accurate quantitative analysis tool is proposed. The initial step involves expectation maximization (EM)-based mixture modeling using a k-means clustering procedure, which varies voxel order for initialization. A multiscale Markov model is then used to refine this segmentation by modeling spatial correlations between neighboring image voxels. An experimental study using an anthropomorphic thorax phantom was conducted for quantitative evaluation of the performance of the proposed segmentation algorithm. The comparison of actual tumor volumes to the volumes calculated using different segmentation methodologies including standard k-means, spatial domain Markov Random Field Model (MRFM), and the new multiscale MRFM proposed in this paper showed that the latter dramatically reduces the relative error to less than 8% for small lesions (7 mm radii) and less than 3.5% for larger lesions (9 mm radii). The analysis of the resulting segmentations of clinical oncologic PET data seems to confirm that this methodology shows promise and can successfully segment patient lesions. For problematic images, this technique enables the identification of tumors situated very close to nearby high normal physiologic uptake. The
International Nuclear Information System (INIS)
Iman, R.L.; Prairie, R.R.; Cramond, W.R.
1985-08-01
This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics
Introduction to modern theoretical physics. Volume II. Quantum theory and statistical physics
International Nuclear Information System (INIS)
Harris, E.G.
1975-01-01
The topics discussed include the history and principles, some solvable problems, and symmetry in quantum mechanics, interference phenomena, approximation methods, some applications of nonrelativistic quantum mechanics, relativistic wave equations, quantum theory of radiation, second quantization, elementary particles and their interactions, thermodynamics, equilibrium statistical mechanics and its applications, the kinetic theory of gases, and collective phenomena
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
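The p-chart construction used for such adverse-event rates can be sketched as follows (standard 3-sigma limits; the monthly counts below are invented for illustration, not the study's data):

```python
import math

def p_chart_limits(event_counts, sample_sizes):
    """Center line and 3-sigma limits for a p-chart of adverse-event rates.
    One (count, n) pair per period; limits vary with the period's n."""
    pbar = sum(event_counts) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        s = math.sqrt(pbar * (1.0 - pbar) / n)
        limits.append((max(0.0, pbar - 3.0 * s), pbar + 3.0 * s))
    return pbar, limits

# Invented monthly data: adverse events out of anesthetics performed
counts = [18, 22, 20]
sizes = [100, 100, 100]
pbar, limits = p_chart_limits(counts, sizes)
```

Periods whose observed rate falls outside these limits signal an unstable process (special-cause variation), while rates within the limits reflect the predictable normal variation the authors describe.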
Pengendalian Kualitas Kertas Dengan Menggunakan Statistical Process Control di Paper Machine 3
Directory of Open Access Journals (Sweden)
Vera Devani
2017-01-01
The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method. Statistical process control (SPC) is a problem-solving technique used to monitor, control, analyze, manage and improve products and processes using statistical methods. Based on Pareto diagrams, the wavy defect is found to be the most frequent, accounting for 81.7% of defects. The human factor, meanwhile, is found to be the main cause of defects, primarily due to lack of understanding of the machinery and lack of training, both leading to errors in data input.
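A Pareto analysis like the one described can be sketched as follows (the category names and counts are invented, chosen only so that the leading category matches the reported 81.7% share of wavy defects):

```python
def pareto(defect_counts):
    """Rank defect categories by frequency with cumulative percentages,
    as used to build a Pareto diagram."""
    total = sum(defect_counts.values())
    ordered = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)
    rows, cum = [], 0.0
    for name, count in ordered:
        cum += 100.0 * count / total
        rows.append((name, count, round(cum, 1)))
    return rows

# Invented counts: 'wavy' carries the reported 81.7% share
rows = pareto({"wavy": 817, "crease": 120, "hole": 63})
```

Reading the cumulative column off such a table identifies the "vital few" defect types that improvement efforts should target first.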
Using Statistical Process Control to Make Data-Based Clinical Decisions.
Pfadt, Al; Wheeler, Donald J.
1995-01-01
Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings
Omar, M. Hafidz
2010-01-01
Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…
Protecting the Force: Application of Statistical Process Control for Force Protection in Bosnia
National Research Council Canada - National Science Library
Finken, Paul
2000-01-01
In Operations Other Than War (OOTW), environments where the enemy is disorganized and incapable of mounting a deception plan, staffs could model hostile events as stochastic events and use statistical methods to detect changes to the process...
Huser, Raphaë l; Opitz, Thomas; Thibaud, Emeric
2018-01-01
Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability
International Nuclear Information System (INIS)
Ross, W.A.; Lokken, R.O.; May, R.P.; Roberts, F.P.; Thornhill, R.E.; Timmerman, C.L.; Treat, R.L.; Westsik, J.H. Jr.
1982-09-01
This volume contains supporting information for the comparative assessment of the transuranic waste forms and processes summarized in Volume I. Detailed data on the characterization of the waste forms selected for the assessment, process descriptions, and cost information are provided. The purpose of this volume is to provide additional information that may be useful when using the data in Volume I and to provide greater detail on particular waste forms and processes. Volume II is divided into two sections and two appendixes. The first section provides information on the preparation of the waste form specimens used in this study and additional characterization data in support of that in Volume I. The second section includes detailed process descriptions for the eight processes evaluated. Appendix A lists the results of MCC-1 leach test and Appendix B lists additional cost data. 56 figures, 12 tables
Statistics to the Rescue!: Using Data to Evaluate a Manufacturing Process
Keithley, Michael G.
2009-01-01
The use of statistics and process controls is too often overlooked in educating students. This article describes an activity appropriate for high school students who have a background in material processing. It gives them a chance to advance their knowledge by determining whether or not a manufacturing process works well. The activity follows a…
Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat
2015-01-01
A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled...
Brain Volume Estimation Enhancement by Morphological Image Processing Tools
Directory of Open Access Journals (Sweden)
Zeinali R.
2017-12-01
Background: Volume estimation of the brain is important for many neurological applications. It is necessary for measuring brain growth and changes in the brain of normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to excellent levels of image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices at good resolution. In this study, we aim to enhance the stereology method for volume estimation of the brain using fewer MRI slices with lower resolution. Methods: A program for calculating volume using the stereology method is introduced. By applying the morphological operation of dilation, the stereology method was enhanced. For the evaluation of this method, we used T1-weighted MR images from a digital phantom in BrainWeb, which provides ground truth. Results: The volumes of 20 normal brains extracted from BrainWeb were calculated. The volumes of white matter, gray matter and cerebrospinal fluid with given dimensions were estimated correctly. Volume calculation with the stereology method was carried out for different cases. In three cases, the root mean square error (RMSE) was measured: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution, as stereology parameters). Comparing the results of the two methods shows that the RMSE values for our proposed method are smaller than those for the stereology method alone. Conclusion: Using the morphological operation of dilation makes it possible to enhance the stereology volume estimation method. With fewer MRI slices and fewer test points, the enhanced method works much better than the plain stereology method.
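The underlying Cavalieri point-counting estimator can be sketched as follows (a simplified form; the paper's enhancement applies morphological dilation to the slices before counting, which is not shown here, and the numbers are invented):

```python
def cavalieri_volume(points_per_slice, slice_thickness, grid_spacing):
    """Cavalieri/stereology estimate: V ~= T * d**2 * sum(P_i), where P_i is
    the number of test-grid points hitting the structure on slice i, T the
    slice thickness and d the test-point spacing."""
    return slice_thickness * grid_spacing ** 2 * sum(points_per_slice)

# Two slices, 10 grid hits each, T = 5, d = 2 (arbitrary units)
v = cavalieri_volume([10, 10], 5.0, 2.0)
```

Each counted point stands in for a d-by-d patch of area on its slice, so fewer slices or coarser grids trade precision for effort, which is exactly the trade-off the dilation step is meant to soften.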
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
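The interrupted-time-series idea can be sketched by fitting separate trend lines before and after the intervention (a simplified two-segment variant of segmented regression, shown on invented noiseless data):

```python
def ols(xs, ys):
    """Ordinary least-squares intercept and slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return my - slope * mx, slope

def interrupted_series(ts, ys, t0):
    """Fit separate trend lines before/after an intervention at time t0 and
    return the (level change, slope change) at t0."""
    pre = [(t, y) for t, y in zip(ts, ys) if t < t0]
    post = [(t, y) for t, y in zip(ts, ys) if t >= t0]
    a1, b1 = ols([p[0] for p in pre], [p[1] for p in pre])
    a2, b2 = ols([p[0] for p in post], [p[1] for p in post])
    return (a2 + b2 * t0) - (a1 + b1 * t0), b2 - b1

# Noiseless toy series with a pure level shift of +5 at t0 = 5
ts = list(range(10))
ys = [t if t < 5 else t + 5 for t in ts]
level, slope = interrupted_series(ts, ys, 5)
```

The level change and slope change at the intervention time are the quantities segmented regression tests, whereas SPC would instead flag post-intervention points falling outside control limits computed from the pre-intervention period.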
International Nuclear Information System (INIS)
Pulsipher, B.A.; Kuhn, W.L.
1987-01-01
Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product
International Nuclear Information System (INIS)
Pulsipher, B.A.; Kuhn, W.L.
1987-02-01
Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs
Halo statistics analysis within medium volume cosmological N-body simulation
Directory of Open Access Journals (Sweden)
Martinović N.
2015-01-01
In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well-known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of the halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. A visualization of the simulation is presented as well. At redshifts z = 0-7, halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 - 10^14 M_⊙/h. [Project 176021: 'Visible and invisible matter in nearby galaxies: theory and observations']
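The power-law fit for the merger rate can be sketched as a log-log least-squares regression (illustrative data generated from an assumed exponent, not the simulation's measurements):

```python
import math

def fit_rate_exponent(redshifts, rates):
    """Least-squares fit of n in rate ~ (1 + z)**n, done in log-log space."""
    xs = [math.log(1.0 + z) for z in redshifts]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx

# Synthetic rates generated from an assumed exponent n = 2.4
zs = [0.5, 1.0, 2.0, 4.0, 6.0]
rates = [3.0 * (1.0 + z) ** 2.4 for z in zs]
n_fit = fit_rate_exponent(zs, rates)
```

Because rate = A(1 + z)^n becomes log(rate) = log A + n*log(1 + z), the exponent is simply the slope of the log-log regression line.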
Auto-recognition of surfaces and auto-generation of material removal volume for finishing process
Kataraki, Pramod S.; Salman Abu Mansor, Mohd
2018-03-01
Auto-recognition of surfaces and auto-generation of material removal volumes for the recognised surfaces have become necessary for successful downstream manufacturing activities such as automated process planning and scheduling. A few researchers have contributed to the generation of material removal volumes for a product, but their methods produce a discontinuity between two adjacent material removal volumes generated from two adjacent faces that form a convex geometry. Material removal volume generation free of this limitation was attempted, and an algorithm was developed that automatically recognises the surfaces of a computer aided design (CAD) model and auto-generates the material removal volume for the finishing process of the recognised surfaces. The developed algorithm successfully recognises the surfaces of the CAD model and obtains the required material removal volume. The material removal volume discontinuity that occurred in earlier studies is eliminated.
Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke
2007-01-01
Ultra-pure water supplied inside the fab is used in different tools at different stages of processing. Particle counts measured in the ultra-pure water were compared with the defect density on wafers processed in these tools, and a statistical relation was found. Keywords: yield, defect density,
International Nuclear Information System (INIS)
Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.
1998-01-01
The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown
Hazard rate model and statistical analysis of a compound point process
Czech Academy of Sciences Publication Activity Database
Volf, Petr
2005-01-01
Roč. 41, č. 6 (2005), s. 773-786 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * Cox regression model * intensity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005
Billings, Paul H.
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
Statistical error in simulations of Poisson processes: Example of diffusion in solids
Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.
2016-08-01
Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method, but is valid in the context of simulation of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
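The abstract does not reproduce the paper's analytical error expression, but for a Poisson process the standard relative error of an estimated event rate is 1/√N for N observed events. The following sketch (all parameter values are invented for illustration) checks that prediction against repeated simulated runs:

```python
import numpy as np

rng = np.random.default_rng(0)
rate, t_sim, n_runs = 5.0, 20.0, 10_000   # event rate, run length, replica count (made up)

# Each replica observes a Poisson-distributed number N of diffusion events;
# the rate estimate per replica is N / t_sim.
counts = rng.poisson(rate * t_sim, size=n_runs)
rate_est = counts / t_sim

analytical = 1.0 / np.sqrt(rate * t_sim)   # predicted relative error: 1/sqrt(<N>)
empirical = rate_est.std(ddof=1) / rate    # observed relative scatter of the estimates

print(analytical, empirical)
```

With a mean of 100 events per run, both numbers should agree closely, illustrating why runs with few events carry large statistical error.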
Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.
1997-01-01
Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor
Parametric analysis of the statistical model of the stick-slip process
Lima, Roberta; Sampaio, Rubens
2017-06-01
In this paper, a parametric analysis of the statistical model of the response of a dry-friction oscillator is performed. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-friction force modeled as Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion. The base velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces stochastic stick-slip oscillations in the system. The system response is composed of a random sequence alternating stick and slip modes. With realizations of the system, a statistical model is constructed for this sequence. In this statistical model, the variables of interest of the sequence are modeled as random variables: for example, the number of time intervals in which stick or slip occurs, the instants at which they begin, and their durations. Samples of the system response are computed by integration of the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated from the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to make a parametric analysis of the statistical model of the stick-slip process.
Statistical test data selection for reliability evaluation of process computer software
International Nuclear Information System (INIS)
Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.
1976-01-01
The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.) [de
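The stratified-sampling approach to test-case selection described above can be sketched as follows. The demand classes, their probabilities, and the temperature variable are purely hypothetical illustrations, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical demand classes (strata) with occurrence probabilities; each stratum
# constrains one input variable of the process-state vector to given limits.
strata = {
    "normal_ops": {"p": 0.90, "temp": (280.0, 300.0)},
    "transient":  {"p": 0.08, "temp": (300.0, 340.0)},
    "shutdown":   {"p": 0.02, "temp": (260.0, 280.0)},
}

n_total = 1000
test_cases = []
for name, s in strata.items():
    n_s = round(s["p"] * n_total)              # proportional allocation per stratum
    lo, hi = s["temp"]
    temps = rng.uniform(lo, hi, size=n_s)      # sample input values within the limits
    test_cases += [(name, t) for t in temps]

print(len(test_cases))
```

Proportional allocation guarantees that rare but safety-relevant demand classes still receive a fixed share of the test budget, which is the point of stratification over purely random draws.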
Ishihara, Masamichi
2018-04-01
We studied the effects of nonextensivity on the phase transition for a system of finite volume V in the ϕ⁴ theory in the Tsallis nonextensive statistics of entropic parameter q and temperature T, when the deviation from Boltzmann-Gibbs (BG) statistics, |q − 1|, is small. We calculated the condensate and the effective mass to order q − 1 with the normalized q-expectation value under the free-particle approximation with zero bare mass. The following facts were found. The condensate Φ divided by v, Φ/v, at q (where v is the value of the condensate at T = 0) is smaller than that at q′ for q > q′ as a function of Tph/v, the physical temperature Tph divided by v. The physical temperature Tph is related to the variation of the Tsallis entropy and the variation of the internal energies, and Tph at q = 1 coincides with T. The effective mass decreases, reaches a minimum, and increases after that, as Tph increases. The effective mass at q > 1 is lighter than the effective mass at q = 1 at low physical temperature and heavier than the effective mass at q = 1 at high physical temperature. The effects of the nonextensivity on a physical quantity as a function of Tph become strong as |q − 1| increases. The results indicate the significance of the definition of the expectation value, the definition of the physical temperature, and the constraints on the density operator, when the terms including the volume of the system are not negligible.
Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li
2017-10-01
To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, the formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties in the production process in real time. The established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
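The abstract names PC-score and Hotelling T2 charts but gives no formulas. A minimal sketch of a T2 chart built on PCA scores might look like the following; the data are synthetic stand-ins for NIR spectra, and the empirical 99% control limit is an assumption in place of the usual F-distribution-based limit:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical reference data: rows are time points from normal batches (synthetic).
X_ref = rng.normal(size=(200, 50))
mu, sd = X_ref.mean(0), X_ref.std(0)

# PCA via SVD on the autoscaled reference data, keeping k components.
k = 5
U, S, Vt = np.linalg.svd((X_ref - mu) / sd, full_matrices=False)
P = Vt[:k].T                          # loadings
lam = (S[:k] ** 2) / (len(X_ref) - 1) # variance of each score

def t2(X):
    """Hotelling T2 of observations in the k-dimensional PCA score space."""
    scores = ((X - mu) / sd) @ P
    return np.sum(scores ** 2 / lam, axis=1)

ucl = np.quantile(t2(X_ref), 0.99)            # empirical control limit (assumption)
X_new = rng.normal(size=(10, 50)) + 5.0       # strongly shifted "faulty" test batch
print(ucl, t2(X_new).mean())
```

Points of a monitored batch whose T2 exceeds the control limit signal that the batch has left the region spanned by normal operation; DModX (residual distance to the PCA model) would complement this by catching variation outside the retained components.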
Statistical analysis and digital processing of the Mössbauer spectra
Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri
2010-02-01
This work is focused on using the statistical methods and development of the filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in the measured spectra are used in many scientific areas. The use of a pure statistical approach in accumulated Mössbauer spectra filtration is described. In Mössbauer spectroscopy, the noise can be considered as a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the non-resonant photons counting with electronic noise (from γ-ray detection and discrimination units), and the velocity system quality that can be characterized by the velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to assign the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The estimation of the theoretical correlation coefficient level which corresponds to the spectrum resolution is performed. Correlation coefficient test is based on comparison of the theoretical and the experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio and the applicability of this method has binding conditions.
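The paper's filter assigns significance to periodogram components under feedback control by a correlation test. The sketch below implements only the basic idea (threshold the periodogram, zero the insignificant components, invert), with a synthetic spectrum and an arbitrary fixed threshold standing in for the authors' Spearman-based criterion:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "Mössbauer-like" spectrum: a smooth doublet of absorption dips in a
# large baseline, with Poisson counting noise (shape and counts are assumptions).
v = np.linspace(-10, 10, 512)
baseline = 1e5
signal = (baseline
          - 8e3 / (1 + ((v + 2) / 0.5) ** 2)
          - 8e3 / (1 + ((v - 2) / 0.5) ** 2))
noisy = rng.poisson(signal).astype(float)

# Periodogram of the mean-removed data; keep only the large components.
F = np.fft.rfft(noisy - noisy.mean())
power = np.abs(F) ** 2
thresh = 10.0 * np.median(power)          # crude significance threshold (assumption)
F_kept = np.where(power > thresh, F, 0.0)
smoothed = np.fft.irfft(F_kept, n=len(noisy)) + noisy.mean()

print(np.abs(noisy - signal).mean(), np.abs(smoothed - signal).mean())
```

Because the line shape concentrates its energy in a few low-frequency components while counting noise spreads uniformly over the spectral domain, thresholding the periodogram improves the signal-to-noise ratio, which is the effect the filter procedure exploits.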
Statistical analysis and digital processing of the Mössbauer spectra
International Nuclear Information System (INIS)
Prochazka, Roman; Tucek, Jiri; Mashlan, Miroslav; Pechousek, Jiri; Tucek, Pavel; Marek, Jaroslav
2010-01-01
This work is focused on using the statistical methods and development of the filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in the measured spectra are used in many scientific areas. The use of a pure statistical approach in accumulated Mössbauer spectra filtration is described. In Mössbauer spectroscopy, the noise can be considered as a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the non-resonant photons counting with electronic noise (from γ-ray detection and discrimination units), and the velocity system quality that can be characterized by the velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to assign the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The estimation of the theoretical correlation coefficient level which corresponds to the spectrum resolution is performed. Correlation coefficient test is based on comparison of the theoretical and the experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio and the applicability of this method has binding conditions
Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System
Directory of Open Access Journals (Sweden)
Stephan Birle
2016-01-01
Full Text Available In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality, it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. Because the cognitive uncertainty among different experts about the limits that define still-acceptable control performance may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs: control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (the model-based reference trajectory), a fuzzy logic controller is used that alters the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
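The moving range chart recommended above is simple to compute: the individuals-chart limits are the mean plus or minus 2.66 times the average moving range, where 2.66 is the standard factor 3/d2 with d2 = 1.128 for moving ranges of two. The data here are invented daily ED metrics, purely for illustration:

```python
import numpy as np

# Hypothetical daily door-to-provider times in minutes (made-up data).
x = np.array([42, 38, 45, 40, 39, 44, 41, 37, 43, 40, 46, 39, 62, 41, 40], float)

mr = np.abs(np.diff(x))        # moving ranges of consecutive observations
mr_bar = mr.mean()
center = x.mean()

# Individuals-chart control limits from the average moving range.
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

# Points outside the limits indicate special cause variation (signal, not noise).
special_cause = np.where((x > ucl) | (x < lcl))[0]
print(center, lcl, ucl, special_cause)
```

Here the day-13 spike falls above the upper control limit and would prompt investigation, while the remaining day-to-day scatter stays inside the limits as common cause variation.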
Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud
2015-12-01
Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.
Use of statistic control of the process as part of a quality assurance plan
International Nuclear Information System (INIS)
Acosta, S.; Lewis, C.
2013-01-01
One of the technical requirements of the IRAM ISO 17025 standard for the accreditation of testing laboratories is assurance of the quality of results through control and monitoring of the factors influencing their reliability. The degree to which each factor contributes to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The environmental measurements laboratory, in the process of accreditation for strontium-90, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work is the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to consider significant process variables, such as temperature and humidity, and to build a blank control chart, which forms the basis of statistical process control. The data obtained yielded lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan
Directory of Open Access Journals (Sweden)
Zuzana ANDRÁSSYOVÁ
2012-07-01
Full Text Available The study deals with an analysis of data intended to improve the use of statistical tools in the assembly processes of automobile seats. Normal distribution of variables is one of the essential conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are increasingly more approaches to handling non-normal data. The appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution to the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study closely processes the measured data of an airbag assembly and aims to obtain normally distributed data and apply statistical process control to them. The contribution concludes with rejection of the null hypothesis (the measured variables do not follow the normal distribution), so it is necessary to work on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to quality output of the whole statistical control of manufacturing processes.
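A goodness-of-fit check against normality, followed by a transformation attempt, can be sketched without the cited software. Here a plain-NumPy Jarque-Bera statistic and a log transform stand in for the StatGraphics/Minitab workflow; the right-skewed data are synthetic, not the airbag measurements:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical skewed measurements (lognormal), i.e. deliberately non-normal.
x = rng.lognormal(mean=2.0, sigma=0.4, size=200)

def jarque_bera(data):
    """Jarque-Bera statistic: large values indicate departure from normality
    (compare against the chi-square(2) 5% cutoff, about 5.99)."""
    d = data - data.mean()
    s = np.sqrt((d ** 2).mean())
    skew = (d ** 3).mean() / s ** 3
    kurt = (d ** 4).mean() / s ** 4
    n = len(data)
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

jb_raw = jarque_bera(x)           # raw data: normality clearly rejected
jb_log = jarque_bera(np.log(x))   # log transform as a simple Box-Cox-like remedy
print(jb_raw, jb_log)
```

For this generating process the log transform is exact, so the statistic drops sharply; for real assembly data, as the abstract notes, a single transformation may still fail and a more elaborate procedure is needed.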
On-line statistical processing of radiation detector pulse trains with time-varying count rates
International Nuclear Information System (INIS)
Apostolopoulos, G.
2008-01-01
Statistical analysis is of primary importance for the correct interpretation of nuclear measurements, due to the inherent random nature of radioactive decay processes. This paper discusses the application of statistical signal processing techniques to the random pulse trains generated by radiation detectors. The aims of the presented algorithms are: (i) continuous, on-line estimation of the underlying time-varying count rate θ(t) and its first-order derivative dθ/dt; (ii) detection of abrupt changes in both of these quantities and estimation of their new value after the change point. Maximum-likelihood techniques, based on the Poisson probability distribution, are employed for the on-line estimation of θ and dθ/dt. Detection of abrupt changes is achieved on the basis of the generalized likelihood ratio statistical test. The properties of the proposed algorithms are evaluated by extensive simulations and possible applications for on-line radiation monitoring are discussed
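A simplified version of the likelihood-based change detection described above can be sketched as follows. The count rates and change point are invented, and the paper's sequential on-line formulation is reduced to a single offline GLR scan over a fixed record of binned counts:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated detector counts per 1-s interval: the rate steps from 10 to 16 cps
# at t = 60 s (both rates and the change point are assumptions for illustration).
counts = np.concatenate([rng.poisson(10, 60), rng.poisson(16, 40)])

def glr_change_point(c):
    """Generalized likelihood ratio scan for a single Poisson rate change.

    For each candidate change point k, the rates before/after are estimated by
    their MLEs (sample means) and compared against a constant-rate model.
    """
    n, lam = len(c), c.mean()
    best_k, best_g = None, 0.0
    for k in range(5, n - 5):                     # guard bands at both ends
        l0, l1 = c[:k].mean(), c[k:].mean()
        g = c[:k].sum() * np.log(l0 / lam) + c[k:].sum() * np.log(l1 / lam)
        if g > best_g:
            best_k, best_g = k, g
    return best_k, best_g

k_hat, g = glr_change_point(counts)
print(k_hat, g)
```

Declaring a change when the maximized log-likelihood ratio exceeds a threshold (chosen for a desired false-alarm rate) is the decision rule; an on-line monitor would repeat this scan over a sliding window as new counts arrive.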
Large-Deviation Results for Discriminant Statistics of Gaussian Locally Stationary Processes
Directory of Open Access Journals (Sweden)
Junichi Hirukawa
2012-01-01
Full Text Available This paper discusses the large-deviation principle of discriminant statistics for Gaussian locally stationary processes. First, large-deviation theorems for quadratic forms and the log-likelihood ratio for a Gaussian locally stationary process with a mean function are proved. Their asymptotics are described by large-deviation rate functions. Second, we consider situations where the processes are misspecified as stationary. In these misspecified cases, we formally construct the log-likelihood ratio discriminant statistics and derive large-deviation theorems for them. Since these are complicated, they are evaluated and illustrated by numerical examples. We find that misspecifying the process as stationary seriously affects the discrimination.
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
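The abstract does not list the pattern rules the expert system encodes; the classic Western Electric tests are a plausible illustration of "abnormal points or patterns" on a control chart. The sketch below checks just two of them, a point beyond 3 sigma and a run of eight points on one side of the center line, on made-up data:

```python
import numpy as np

def western_electric_flags(x, mean, sigma):
    """Flag two classic Western Electric patterns (an illustrative subset)."""
    z = (np.asarray(x, float) - mean) / sigma
    flags = []
    for i in range(len(z)):
        if abs(z[i]) > 3:                        # Rule 1: a point beyond 3 sigma
            flags.append((i, "beyond_3sigma"))
        if i >= 7 and (all(z[i - 7:i + 1] > 0) or all(z[i - 7:i + 1] < 0)):
            flags.append((i, "run_of_8_one_side"))  # run of 8 on one side of center
    return flags

# Made-up chart values with one outlier and a sustained positive run.
data = [0.1, -0.4, 0.2, 3.4, 0.3, 0.5, 0.2, 0.6, 0.1, 0.4, 0.2, 0.3]
print(western_electric_flags(data, mean=0.0, sigma=1.0))
```

Rules like these are well suited to automation because each is a deterministic predicate over a window of points, which is exactly the kind of knowledge a rule-based expert system can encode.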
International Nuclear Information System (INIS)
Slowinski, B.
1987-01-01
A description is given of a simple phenomenological model of the electromagnetic cascade process (ECP) initiated by high-energy gamma quanta in heavy absorbents. Within this model the spatial structure and fluctuations of the ionization losses of shower electrons and positrons are described. Concrete formulae have been obtained as a result of statistical analysis of experimental data from the xenon bubble chamber of ITEP (Moscow)
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We employ the replica method of statistical physics to study the average case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on Gaussian processes, we discuss Bootstrap estimates for learning curves.
Spectral deformation techniques applied to the study of quantum statistical irreversible processes
International Nuclear Information System (INIS)
Courbage, M.
1978-01-01
A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)
Hiemstra, Djoerd; de Jong, Franciska M.G.
2001-01-01
Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
Smith, Toni M.; Hjalmarson, Margret A.
2013-01-01
The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…
Matthews, Tansy E.
2009-01-01
This article describes the development of the Virtual Library of Virginia (VIVA). The VIVA statistics-processing system remains a work in progress. Member libraries will benefit from the ability to obtain the actual data from the VIVA site, rather than just the summaries, so a project to make these data available is currently being planned. The…
Reducing lumber thickness variation using real-time statistical process control
Thomas M. Young; Brian H. Bond; Jan Wiedenbeck
2002-01-01
A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...
Hantula, Donald A.
1995-01-01
Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection
Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant
2006-01-01
An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…
Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann
2013-01-01
Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…
van den Ende, Jan; van Oost, Elizabeth C.J.
2001-01-01
This article is a longitudinal analysis of the relation between gendered labour divisions and new data processing technologies at the Dutch Central Bureau of Statistics (CBS). Following social-constructivist and evolutionary economic approaches, the authors hold that the relation between technology
Study of film data processing systems by means of a statistical simulation
International Nuclear Information System (INIS)
Deart, A.F.; Gromov, A.I.; Kapustinskaya, V.I.; Okorochenko, G.E.; Sychev, A.Yu.; Tatsij, L.I.
1974-01-01
A statistical model of the film information processing system is considered. The given time diagrams illustrate the model's operation algorithm. The program realizing this model of the system is described in detail. The elaborated program model has been tested on the film information processing system, which comprises a group of measuring devices operating in line with a BESM computer. The quantitative operating characteristics obtained for the system under test make it possible to estimate the efficiency of system operation
Processing Pineapple Pulp into Dietary Fibre (Volume 12, No. 6, October 2012)
African Journals Online (AJOL)
CRSP
2012-10-06
Oct 6, 2012 … The study investigated the processing of pineapple pulp waste from a processing plant into dietary fibre, classified chemically as cellulose, hemicellulose and lignin constituents; the drying time was shorter compared to freeze-drying and yielded a …
Description of ground motion data processing codes: Volume 3
International Nuclear Information System (INIS)
Sanders, M.L.
1988-02-01
Data processing codes developed to process ground motion at the Nevada Test Site for the Weapons Test Seismic Investigations Project are used today as part of the program to process ground motion records for the Nevada Nuclear Waste Storage Investigations Project. The work contained in this report documents and lists codes and verifies the ''PSRV'' code. 39 figs
Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A
2010-06-01
The first purpose of this study was to illustrate the contribution of statistical process control (SPC) to improved safety in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results; it is therefore necessary to bring portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with SPC tools). The second objective was to determine whether portal dosimetry can be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, the reference detector for absolute dose measurement, and with portal dosimetry for verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head-and-neck beams and 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed an introduced special cause (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to
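The capability study described in this abstract boils down to comparing the spread of the dose delivery process against its tolerance limits. As a minimal, hypothetical sketch (the ±4% tolerance and the simulated dose deviations below are illustrative assumptions, not the study's data), the usual Cp/Cpk indices can be computed as:

```python
import numpy as np

def capability_indices(x, lsl, usl):
    """Cp and Cpk from measurements x and lower/upper specification limits."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # penalizes off-centre processes
    return cp, cpk

rng = np.random.default_rng(0)
# simulated dose deviations (%) between measurement and calculation
doses = rng.normal(loc=0.0, scale=1.0, size=200)
# +/-4% tolerance band is an illustrative assumption
cp, cpk = capability_indices(doses, lsl=-4.0, usl=4.0)
```

Cpk is never larger than Cp; equality holds only when the process is perfectly centred between the limits.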
International Nuclear Information System (INIS)
Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.
2010-01-01
Purpose: The first purpose of this study was to illustrate the contribution of statistical process control (SPC) to improved safety in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results; it is therefore necessary to bring portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with SPC tools). The second objective was to determine whether portal dosimetry can be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, the reference detector for absolute dose measurement, and with portal dosimetry for verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head-and-neck beams and 100 prostate beams. Results: Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed an introduced special cause (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to
Cogeneration technology alternatives study. Volume 2: Industrial process characteristics
1980-01-01
Information and data for 26 industrial processes are presented. The following information is given for each process: (1) a description of the process including the annual energy consumption and product production and plant capacity; (2) the energy requirements of the process for each unit of production and the detailed data concerning electrical energy requirements and also hot water, steam, and direct fired thermal requirements; (3) anticipated trends affecting energy requirements with new process or production technologies; and (4) representative plant data including capacity and projected requirements through the year 2000.
Sung, Jaeyoung
2007-07-01
We present an exact theoretical test of Jarzynski's equality (JE) for reversible volume-switching processes of an ideal gas system. The exact analysis shows that the JE prediction for the free energy difference equals the work done on the gas system during the reversible process, which depends on the shape of the path of the reversible volume-switching process.
Statistical process control applied to the liquid-fed ceramic melter process
International Nuclear Information System (INIS)
Pulsipher, B.A.; Kuhn, W.L.
1987-09-01
In this report, an application of control charts to the apparent feed composition of a Liquid-Fed Ceramic Melter (LFCM) is demonstrated using results from a simulation of the LFCM system. Usual applications of control charts require the assumption of observations that are uncorrelated over time. This assumption is violated in the LFCM system because of the heels left in tanks from previous batches. Methods for dealing with this problem have been developed to create control charts for individual batches sent to the feed preparation tank (FPT). These control charts are capable of detecting changes in the process average as well as changes in the process variation. All numbers reported in this document were derived from a simulated demonstration of a plausible LFCM system. In practice, site-specific data must be used as input to a simulation tailored to that site; these data directly affect all variance estimates used to develop the control charts. 64 refs., 3 figs., 2 tabs
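The autocorrelation problem noted above (heels carrying over between batches) is commonly handled by charting the residuals of a fitted time-series model rather than the raw observations, since one-step residuals are approximately uncorrelated and the usual 3-sigma limits apply. A minimal sketch of that general idea, assuming an AR(1) carry-over and simulated data (not the report's actual method or data):

```python
import numpy as np

def ar1_residual_chart(x):
    """Residuals-based individuals chart for autocorrelated data.

    Fits an AR(1) model by least squares and returns the one-step
    residuals together with 3-sigma control limits for them.
    """
    x = np.asarray(x, dtype=float)
    x0, x1 = x[:-1] - x.mean(), x[1:] - x.mean()
    phi = (x0 @ x1) / (x0 @ x0)       # AR(1) coefficient estimate
    resid = x1 - phi * x0             # one-step prediction residuals
    s = resid.std(ddof=1)
    return resid, (-3 * s, 3 * s), phi

rng = np.random.default_rng(2)
# simulate batches whose composition carries over (heel effect), phi = 0.6
e = rng.normal(0, 1, 200)
x = np.empty(200)
x[0] = e[0]
for t in range(1, 200):
    x[t] = 0.6 * x[t - 1] + e[t]
resid, (lcl, ucl), phi = ar1_residual_chart(x)
```

Points of `resid` falling outside `(lcl, ucl)` would signal a shift beyond what the batch-to-batch carry-over explains.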
Noskievičová, Darja; Kucharczyk, Radim
2012-01-01
This paper deals with the effective application of SPC to the process of lengthwise tonsure of rolled plates on double-side scissors. After an explanation of the SPC fundamentals, goals, and common mistakes during SPC implementation, a methodical framework for effective SPC application is defined. The next part of the paper describes the practical application of SPC and analyses it from the point of view of this framework.
International Nuclear Information System (INIS)
Lopez de la Cruz, J.; Gutierrez, M.A.
2008-01-01
This paper presents a stochastic analysis of spatial point patterns as effect of localized pitting corrosion. The Quadrat Counts method is studied with two empirical pit patterns. The results are dependent on the quadrat size and bias is introduced when empty quadrats are accounted for the analysis. The spatially inhomogeneous Poisson process is used to improve the performance of the Quadrat Counts method. The latter combines Quadrat Counts with distance-based statistics in the analysis of pit patterns. The Inter-Event and the Nearest-Neighbour statistics are here implemented in order to compare their results. Further, the treatment of patterns in irregular domains is discussed
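The Quadrat Counts method discussed above can be sketched in a few lines: grid the domain, count pits per quadrat, and compare the variance-to-mean ratio of the counts against the value of 1 expected under complete spatial randomness. The pattern below is simulated on the unit square, not one of the paper's empirical pit patterns, and the 5 × 5 grid is an arbitrary choice (the abstract notes the result depends on quadrat size):

```python
import numpy as np

def quadrat_counts(points, k):
    """Count points of a 2D pattern in a k x k grid over the unit square."""
    idx = np.clip((points * k).astype(int), 0, k - 1)
    counts = np.zeros((k, k), dtype=int)
    for i, j in idx:
        counts[i, j] += 1
    return counts

def dispersion_index(counts):
    """Variance-to-mean ratio; close to 1 for complete spatial randomness."""
    c = counts.ravel()
    return c.var(ddof=1) / c.mean()

rng = np.random.default_rng(3)
pits = rng.random((500, 2))        # simulated CSR pit pattern
counts = quadrat_counts(pits, k=5)
I = dispersion_index(counts)       # I >> 1 would indicate clustering
```

Values of `I` well above 1 indicate clustered pitting; values well below 1 indicate regularity.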
Energy Technology Data Exchange (ETDEWEB)
1980-01-01
Volume 1 describes the proposed plant: the KBW gasification process, the ICI low-pressure methanol process and the Mobil M-gasoline process, together with ancillary processes such as the oxygen plant, shift process, RECTISOL purification process, sulfur recovery equipment and pollution control equipment. Numerous engineering diagrams are included. (LTN)
Directory of Open Access Journals (Sweden)
N. V. Zhelninskaya
2015-01-01
Full Text Available Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are among the most important elements of a production quality assurance system and total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the specified product accuracy, and assess process stability. Most random events in life, and particularly in manufacturing and scientific research, are characterized by a large number of random factors and are described by the normal distribution, which is the principal one in many practical studies. Modern statistical methods are quite difficult to grasp and to use widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, all the characteristics of a batch of products can be obtained, and the mean value and variance determined. Statistical control and quality control methods were used in the analysis of the accuracy and stability of the technological process for producing epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defective products of the investigated object was determined. For the stability assessment of the ED-20 manufacturing process, Shewhart control charts for quantitative data were selected: charts of individual values X and moving range R. Pareto charts were used to identify the causes that most affect low dynamic viscosity. The causes of low dynamic viscosity values were analysed with an Ishikawa diagram, which shows the most typical factors in the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.
Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan
2016-01-01
We aim to describe statistical process control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met the inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, so the centre line underwent inflexions. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.
Al-Hussein, Fahad A
2009-01-01
To use statistical control charts in a series of audits to improve the acceptance and consistent use of guidelines, and to reduce variations in prescription processing in primary health care. A series of audits was done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions given in the two previous weeks. Thirty-six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by failure to record the patient's weight (16.4%), the pharmacist's name (14.3%), the duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
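The p-chart construction used in these audits is standard: with samples of n = 30 prescriptions, 3-sigma limits around the average non-conformity proportion flag out-of-control audits. The fortnightly counts below are illustrative, loosely mimicking the reported decline from 100% to 40%, not the study's raw data:

```python
import numpy as np

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a p-chart with constant sample size n."""
    se = np.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * se), p_bar + 3 * se

# illustrative fortnightly counts of non-conforming prescriptions, n = 30 each
nonconforming = np.array([30, 28, 25, 20, 18, 15, 14, 12, 12, 11, 12, 13])
n = 30
p = nonconforming / n
p_bar = p.mean()
lcl, ucl = p_chart_limits(p_bar, n)
out_of_control = p[(p < lcl) | (p > ucl)]   # audits signalling special causes
```

With these numbers the earliest audits (100% and 93% non-conformity) fall above the upper limit, mirroring the improvement trend the abstract describes.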
Statistical trajectory of an approximate EM algorithm for probabilistic image processing
International Nuclear Information System (INIS)
Tanaka, Kazuyuki; Titterington, D M
2007-01-01
We calculate analytically a statistical average of trajectories of an approximate expectation-maximization (EM) algorithm with generalized belief propagation (GBP) and a Gaussian graphical model for the estimation of hyperparameters from observable data in probabilistic image processing. A statistical average with respect to observed data corresponds to a configuration average for the random-field Ising model in spin glass theory. In the present paper, hyperparameters which correspond to interactions and external fields of spin systems are estimated by an approximate EM algorithm. A practical algorithm is described for gray-level image restoration based on a Gaussian graphical model and GBP. The GBP approach corresponds to the cluster variation method in statistical mechanics. Our main result in the present paper is to obtain the statistical average of the trajectory in the approximate EM algorithm by using loopy belief propagation and GBP with respect to degraded images generated from a probability density function with true values of hyperparameters. The statistical average of the trajectory can be expressed in terms of recursion formulas derived from some analytical calculations
Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta
2010-04-20
Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionality, such as analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system, which makes integrating, substituting, or extending tools expensive because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting them to another system usually requires extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated into any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.
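The configuration-driven registration of analyses described here can be sketched as follows. The XML schema, element names, and function registry below are assumptions for illustration only, not the paper's actual service interface; the point is that adding an analysis means adding one configuration entry rather than changing program code:

```python
import xml.etree.ElementTree as ET
import statistics

# hypothetical configuration: each <application> maps a service name
# to a registered callable; a new analysis is just another XML entry
CONFIG = """
<applications>
  <application name="mean" func="mean"/>
  <application name="stdev" func="stdev"/>
</applications>
"""

# the registry of available implementations (illustrative)
REGISTRY = {"mean": statistics.mean, "stdev": statistics.stdev}

def load_services(xml_text):
    """Build the service dispatch table from the XML configuration."""
    root = ET.fromstring(xml_text)
    return {a.get("name"): REGISTRY[a.get("func")]
            for a in root.findall("application")}

services = load_services(CONFIG)
result = services["mean"]([1.0, 2.0, 3.0])
```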
Directory of Open Access Journals (Sweden)
Zhang, M. Z.
2010-12-01
Concrete diffusivity is a function of its microstructure on many scales, ranging from nanometres to millimetres; multi-scale techniques are therefore needed to model this parameter. The representative elementary volume (REV), in conjunction with the homogenization principle, is one of the most common multi-scale approaches. This study aimed to establish a procedure for determining the REV required to calculate cement paste diffusivity, based on a three-step numerical-statistical approach. First, several series of 3D cement paste microstructures were generated with HYMOSTRUC3D, a cement hydration and microstructure model, for different volumes of cement paste and w/c ratios ranging from 0.30 to 0.60. Second, the finite element method was used to simulate the diffusion of tritiated water through these microstructures, and effective cement paste diffusivity values for the different volumes were obtained by applying Fick's law. Finally, statistical analysis was used to find the fluctuation in effective diffusivity with cement paste volume, from which the REV was determined. The conclusion drawn was that the REV for measuring diffusivity in cement paste is 100 × 100 × 100 μm³.
Concrete diffusivity depends on its microstructure at many scales, from nanometres to millimetres, so multi-scale techniques are needed to represent this parameter. Together with the homogenization principle, one of the most common multi-scale methods is the representative elementary volume (REV). The object of this study was to establish a procedure for determining the REV needed to calculate the diffusivity of cement paste, based on a three-stage numerical-statistical method. First, several series of 3D cement paste microstructures were created with HYMOSTRUC3D, a program for modelling cement hydration and microstructure. Then the method of
International Nuclear Information System (INIS)
Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.
1999-01-01
Statistical Process Control (SPC) is one of the systematic tools that makes a valuable contribution to the control and planning activities for manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a program to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on the realization of SPC in 3 pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC, and modifications were subsequently made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser welding of spacers' are located at the different sites of ANF. Owing to the completion of the first steps and the experience gained in the pilot projects, the introduction program for SPC has already been extended to other manufacturing processes. (authors)
Directory of Open Access Journals (Sweden)
Vera Devani
2014-06-01
PKS “XYZ” is a company operating in palm oil processing. Its products are Crude Palm Oil (CPO) and Palm Kernel Oil (PKO). The aim of this research was to analyse oil losses and their contributing factors using the Statistical Process Control method. Statistical Process Control is a collection of strategies, techniques, and actions taken by an organization to ensure that it delivers a quality product or provides a quality service. The sampled sources of CPO oil losses studied were empty fruit bunches, nuts, fibre, and final sludge. Based on the I-MR control charts, it can be concluded that all four types of CPO oil losses are within the control limits and consistent. However, the Cpk value of the total oil losses lies outside the control limits of the process average; this means the CPO produced has met customer requirements, with total oil losses below the maximum limit set by the company, namely 1.65%.
Morphology of Laplacian growth processes and statistics of equivalent many-body systems
International Nuclear Information System (INIS)
Blumenfeld, R.
1994-01-01
The author proposes a theory for the nonlinear evolution of two-dimensional interfaces in Laplacian fields. The growing region is conformally mapped onto the unit disk, generating an equivalent many-body system whose dynamics and statistics are studied. The process is shown to be Hamiltonian, with the Hamiltonian being the imaginary part of the complex electrostatic potential. Surface effects are introduced through the Hamiltonian as an external field. An extension to a continuous density of particles is presented. The results are used to study the morphology of the interface using statistical mechanics for the many-body system. The distribution of the curvature and the moments of the growth probability along the interface are calculated exactly from the distribution of the particles. In the dilute limit, the distribution of the curvature is shown to develop algebraic tails, which may, for the first time, explain the origin of fractality in diffusion-controlled processes
Application of Statistical Process Control (SPC) in Quality Control
Directory of Open Access Journals (Sweden)
Carlos Hernández-Pedrera
2015-12-01
The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. As specific objectives, we set out to identify the variables to be analysed for statistical process control (SPC), to analyse possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with the SPC study before and after direct monitoring on the production line. Sampling and laboratory replacement methods were used to determine the quality of the finished product; statistical methods were then applied, emphasizing the importance and contribution of their application to monitoring corrective actions and supporting production processes. It was shown that the process is under control because the results fell within the established control limits. There is, however, a tendency to drift toward one end of the limits, and the distribution exceeds them, creating the possibility that under certain conditions the process goes out of control; the results also showed that, although within the quality control limits, the process is operating far from optimal conditions. In none of the study situations were products obtained outside the weight and discoloration limits, but defective products were nevertheless obtained.
Moore, Sarah J; Herst, Patries M; Louwe, Robert J W
2018-05-01
A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.
Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A
2017-08-07
A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, being a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations to the process continuity was also evaluated using Hotelling's T 2 and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
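The PCA-based MSPC monitoring described above reduces, in sketch form, to projecting each new sample onto a model of normal operating conditions and computing Hotelling's T² within the model plane and the Q statistic (squared prediction error) off it. The data below are simulated and the code is a generic illustration under those assumptions, not the ConsiGma-25 model or its 35 logged variables:

```python
import numpy as np

def fit_pca_mspc(X, n_comp):
    """Fit a PCA model on normal-operating data and return a monitor
    that computes Hotelling's T^2 and Q (SPE) for a new sample."""
    mu = X.mean(axis=0)
    Xc = X - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_comp].T                       # loadings (variables x components)
    lam = (s[:n_comp] ** 2) / (len(X) - 1)  # score variances
    def monitor(x):
        t = (x - mu) @ P                    # scores of the new sample
        t2 = np.sum(t**2 / lam)             # Hotelling's T^2 in the model plane
        resid = (x - mu) - t @ P.T          # part not explained by the model
        q = resid @ resid                   # Q / squared prediction error
        return t2, q
    return monitor

rng = np.random.default_rng(4)
# normal operating data: 2 latent factors driving 6 process variables
latent = rng.normal(size=(300, 2))
W = rng.normal(size=(2, 6))
X = latent @ W + 0.1 * rng.normal(size=(300, 6))

monitor = fit_pca_mspc(X, n_comp=2)
t2_ok, q_ok = monitor(X[0])                # in-control sample
t2_bad, q_bad = monitor(X[0] + 5.0)        # imposed disturbance off the model plane
```

In practice the T² and Q values are plotted on control charts with empirically set limits, and contribution plots identify which variables drive an alarm, as in the abstract.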
Optimization Model for Uncertain Statistics Based on an Analytic Hierarchy Process
Directory of Open Access Journals (Sweden)
Yongchao Hou
2014-01-01
Uncertain statistics is a methodology for collecting and interpreting an expert's experimental data using uncertainty theory. To estimate uncertainty distributions, an optimization model based on the analytic hierarchy process (AHP) and an interpolation method is proposed in this paper. In addition, the principle of least squares is presented to estimate uncertainty distributions of known functional form. Finally, the effectiveness of the method is illustrated by an example.
Effect of hospital volume on processes of breast cancer care: A National Cancer Data Base study.
Yen, Tina W F; Pezzin, Liliana E; Li, Jianing; Sparapani, Rodney; Laud, Purushuttom W; Nattinger, Ann B
2017-05-15
The purpose of this study was to examine variations in delivery of several breast cancer processes of care that are correlated with lower mortality and disease recurrence, and to determine the extent to which hospital volume explains this variation. Women who were diagnosed with stage I-III unilateral breast cancer between 2007 and 2011 were identified within the National Cancer Data Base. Multiple logistic regression models were developed to determine whether hospital volume was independently associated with each of 10 individual process of care measures addressing diagnosis and treatment, and 2 composite measures assessing appropriateness of systemic treatment (chemotherapy and hormonal therapy) and locoregional treatment (margin status and radiation therapy). Among 573,571 women treated at 1755 different hospitals, 38%, 51%, and 10% were treated at high-, medium-, and low-volume hospitals, respectively. On multivariate analysis controlling for patient sociodemographic characteristics, treatment year and geographic location, hospital volume was a significant predictor for cancer diagnosis by initial biopsy (medium volume: odds ratio [OR] = 1.15, 95% confidence interval [CI] = 1.05-1.25; high volume: OR = 1.30, 95% CI = 1.14-1.49), negative surgical margins (medium volume: OR = 1.15, 95% CI = 1.06-1.24; high volume: OR = 1.28, 95% CI = 1.13-1.44), and appropriate locoregional treatment (medium volume: OR = 1.12, 95% CI = 1.07-1.17; high volume: OR = 1.16, 95% CI = 1.09-1.24). Diagnosis of breast cancer before initial surgery, negative surgical margins and appropriate use of radiation therapy may partially explain the volume-survival relationship. Dissemination of these processes of care to a broader group of hospitals could potentially improve the overall quality of care and outcomes of breast cancer survivors. Cancer 2017;123:957-66. © 2016 American Cancer Society.
Cogeneration Technology Alternatives Study (CTAS). Volume 3: Industrial processes
Palmer, W. B.; Gerlaugh, H. E.; Priestley, R. R.
1980-01-01
Cogenerating electric power and process heat in single energy conversion systems rather than separately in utility plants and in process boilers is examined in terms of cost savings. The use of various advanced energy conversion systems are examined and compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. About fifty industrial processes from the target energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered as candidate which can be made available by the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum based residual and distillate liquid fuels, and low Btu gas obtained through the on site gasification of coal. An attempt was made to use consistent assumptions and a consistent set of ground rules specified by NASA for determining performance and cost. Data and narrative descriptions of the industrial processes are given.
Navard, Sharon E.
1989-01-01
In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
Implementation of statistical process control for proteomic experiments via LC MS/MS.
Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J
2014-04-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed that implements aspects of SPC (e.g., control charts and Pareto analysis) within the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards, enabling the separation of random noise from systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts for evaluating proteomic experiments is illustrated in two case studies.
Tungsten Ions in Plasmas: Statistical Theory of Radiative-Collisional Processes
Directory of Open Access Journals (Sweden)
Alexander V. Demura
2015-05-01
Full Text Available A statistical model for calculating collisional-radiative processes in plasmas with tungsten impurity was developed. The electron structure of multielectron tungsten ions is considered in terms of both the Thomas-Fermi model and the Brandt-Lundquist model of collective oscillations of atomic electron density. The excitation or ionization of atomic electrons by plasma electron impact is represented as a photo-process under the action of the flux of equivalent photons introduced by E. Fermi. The total electron-impact single-ionization cross-sections of Wk+ ions, with the respective rates, have been calculated and compared with the available experimental and modeling data (e.g., CADW). Plasma radiative losses from the tungsten impurity were also calculated over a wide range of electron temperatures, 1 eV–20 keV. The numerical code TFATOM was developed for calculations of radiative-collisional processes involving tungsten ions. The computational resources needed for the TFATOM code are orders of magnitude smaller than for other conventional numerical codes. The transition from the corona to the Boltzmann limit was investigated in detail. The results of the statistical approach have been tested by comparison with the extensive experimental and conventional-code data for a set of Wk+ ions. It is shown that the accuracy of the universal statistical model for ionization cross-sections and radiation losses is within the data scatter of significantly more complex quantum numerical codes that use different approximations for the calculation of atomic structure and electronic cross-sections.
International Nuclear Information System (INIS)
Neuls, A.S.; Draper, W.E.; Koenig, R.A.; Newmyer, J.M.; Warner, C.L.
1982-11-01
This two-volume report is a detailed design and operating documentation of the Los Alamos National Laboratory Controlled Air Incinerator (CAI) and is an aid to technology transfer to other Department of Energy contractor sites and the commercial sector. Volume I describes the CAI process, equipment, and performance, and it recommends modifications based on Los Alamos experience. It provides the necessary information for conceptual design and feasibility studies. Volume II provides descriptive engineering information such as drawings, specifications, calculations, and costs. It aids duplication of the process at other facilities.
International Nuclear Information System (INIS)
Neuls, A.S.; Draper, W.E.; Koenig, R.A.; Newmyer, J.M.; Warner, C.L.
1982-08-01
This two-volume report is a detailed design and operating documentation of the Los Alamos National Laboratory Controlled Air Incinerator (CAI) and is an aid to technology transfer to other Department of Energy contractor sites and the commercial sector. Volume I describes the CAI process, equipment, and performance, and it recommends modifications based on Los Alamos experience. It provides the necessary information for conceptual design and feasibility studies. Volume II provides descriptive engineering information such as drawings, specifications, calculations, and costs. It aids duplication of the process at other facilities.
International Nuclear Information System (INIS)
Boning, Duane S.; Chung, James E.
1998-01-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
Gupta, Munish; Kaplan, Heather C
2017-09-01
Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
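The distinction above between common and special cause variation can be sketched with a minimal individuals (Shewhart) control chart. The data below are illustrative, not from the review; 2.66 is the standard individuals-chart constant for moving ranges of size 2:

```python
# Hedged sketch: Shewhart individuals (I) chart.
# Center line = process mean; limits = mean +/- 2.66 * mean moving range.
# Points within the limits reflect common cause variation; points outside
# signal special (assignable) cause variation.

def individuals_chart(data):
    mr = [abs(b - a) for a, b in zip(data, data[1:])]   # moving ranges
    mr_bar = sum(mr) / len(mr)
    center = sum(data) / len(data)
    ucl = center + 2.66 * mr_bar
    lcl = center - 2.66 * mr_bar
    signals = [i for i, x in enumerate(data) if x > ucl or x < lcl]
    return center, lcl, ucl, signals

# A stable process with one injected shift at index 10
data = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.0, 10.3, 9.9, 14.5]
center, lcl, ucl, signals = individuals_chart(data)
```

Only the shifted point should be flagged; the routine baseline variation stays inside the limits.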
High-volume manufacturing device overlay process control
Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Lee, DongYoung; Song, ChangRock; Heo, Hoyoung; Brinster, Irina; Choi, DongSub; Robinson, John C.
2017-03-01
Overlay control based on DI metrology of optical targets has been the primary basis for run-to-run process control for many years. In previous work we described a scenario where optical overlay metrology is performed on metrology targets on a high-frequency basis, including every lot (or most lots) at DI. SEM-based FI metrology is performed on-device, in-die, as-etched on an infrequent basis. Hybrid control schemes of this type have been in use for many process nodes. What is new is the relative size of the NZO compared to the overlay spec, and the need for more comprehensive solutions to characterize and control the size and variability of NZO at the 1x nm node: sampling, modeling, temporal frequency, and control aspects, as well as trade-offs between SEM throughput and accuracy.
Process simulation and statistical approaches for validating waste form qualification models
International Nuclear Information System (INIS)
Kuhn, W.L.; Toland, M.R.; Pulsipher, B.A.
1989-05-01
This report describes recent progress toward one of the principal objectives of the Nuclear Waste Treatment Program (NWTP) at the Pacific Northwest Laboratory (PNL): to establish relationships between vitrification process control and glass product quality. During testing of a vitrification system, it is important to show that departures affecting the product quality can be sufficiently detected through process measurements to prevent an unacceptable canister from being produced. Meeting this goal is a practical definition of a successful sampling, data analysis, and process control strategy. A simulation model has been developed and preliminarily tested by applying it to approximate operation of the West Valley Demonstration Project (WVDP) vitrification system at West Valley, New York. Multivariate statistical techniques have been identified and described that can be applied to analyze large sets of process measurements. Information on components, tanks, and time is then combined to create a single statistic through which all of the information can be used at once to determine whether the process has shifted away from a normal condition.
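The idea of combining many correlated process measurements into a single monitoring statistic can be illustrated with a Hotelling-type T² computation. The reference mean and covariance below are invented for the sketch and are not WVDP data:

```python
# Hedged sketch: a Hotelling-type T^2 statistic, the classical way to fold
# several correlated measurements into one number for monitoring.
import numpy as np

def hotelling_t2(x, mean, cov):
    # T^2 = (x - mean)^T * cov^-1 * (x - mean)
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(d @ np.linalg.inv(cov) @ d)

# Illustrative in-control reference (two correlated process variables)
mean = [5.0, 3.0]
cov = [[1.0, 0.3], [0.3, 0.5]]

t2_in = hotelling_t2([5.1, 3.1], mean, cov)   # near the reference point
t2_out = hotelling_t2([8.0, 1.0], mean, cov)  # far from it
```

A shifted process produces a much larger T² than an in-control one, so a single control limit on T² monitors all variables at once.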
Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo
2018-01-01
A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution with explicit buildings, urban structures such as sea front walls, and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of available hard and soft data and conversion of qualitative to quantitative information to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore conditions scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagation inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential hazard evolution. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volume along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in terms of flooding, such as +13.5 % of water volumes propagating inland or +11.3 % of affected surfaces. In some areas, flood response appears to be very sensitive to the chosen scenario with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness or the uncertainty over the results. Considering a 100-year scenario with mean sea level rise (0.6 m), hazard
Pitard, Francis F
1993-01-01
Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...
National Science Foundation, Washington, DC.
During the March through July 1981 period a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived…
Measurement and modeling of advanced coal conversion processes, Volume III
Energy Technology Data Exchange (ETDEWEB)
Ghani, M.U.; Hobbs, M.L.; Hamblen, D.G. [and others]
1993-08-01
A generalized one-dimensional, heterogeneous, steady-state, fixed-bed model for coal gasification and combustion is presented. The model, FBED-1, is a design and analysis tool that can be used to simulate a variety of gasification, devolatilization, and combustion processes. The model considers separate gas and solid temperatures, axially variable solid and gas flow rates, variable bed void fraction, coal drying, devolatilization based on chemical functional group composition, depolymerization, vaporization and crosslinking, oxidation, and gasification of char, and partial equilibrium in the gas phase.
Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe
2017-03-01
Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses developed rapidly, reaching significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.
Al-Hussein, Fahad Abdullah
2008-01-01
Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving the management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs: those measuring clinical outcomes, process of care, or both. King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer, and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts, p charts, displayed for all employees. Data extraction, archiving, entry, analysis, plotting, and the design and preparation of p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. The audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process-measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year (P process audits and feedbacks. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.
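The p charts used in the audits above plot the proportion of non-conforming items per sample against 3-sigma binomial limits. A minimal sketch follows; the non-conforming counts are invented, though the sample size of 30 files matches the study design:

```python
# Hedged sketch: p-chart (proportion non-conforming) center line and limits.
# p_bar is the pooled proportion; limits are p_bar +/- 3 * sqrt(p_bar(1-p_bar)/n),
# clipped to [0, 1] since a proportion cannot leave that range.
import math

def p_chart_limits(nonconforming, sample_sizes):
    total_n = sum(sample_sizes)
    p_bar = sum(nonconforming) / total_n
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)
        ucl = min(1.0, p_bar + 3 * sigma)
        limits.append((lcl, ucl))
    return p_bar, limits

# Five illustrative audit samples of 30 files each
counts = [5, 4, 6, 2, 3]
sizes = [30] * 5
p_bar, limits = p_chart_limits(counts, sizes)
```

With small samples the lower limit is often clipped at zero, so only upward shifts in non-conformity are detectable.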
Directory of Open Access Journals (Sweden)
M Jafarlou
2014-04-01
Full Text Available Physical properties of agricultural products such as volume are among the most important parameters influencing grading and packaging systems, and they should be measured accurately for any good system design. Image processing and neural network techniques are both non-destructive and useful methods that have recently been used for this purpose. In this study, images of apples were captured from a constant distance and then processed in MATLAB software, and the edges of the apple images were extracted. The interior area of the apple image was divided into thin trapezoidal elements perpendicular to the longitudinal axis. The total volume of the apple was estimated by summing the incremental volumes of these elements revolved around the apple's longitudinal axis. A picture of a half-cut apple was also captured in order to obtain the volume of the apple shape's indentation, which was subtracted from the previously estimated total volume. The real volume of the apples was measured using the water displacement method, and the relation between the real and estimated volumes was obtained. The t-test and Bland-Altman analysis indicated that the difference between the real and estimated volumes was not significant (p > 0.05); the mean difference was 1.52 cm3 and the accuracy of measurement was 92%. Utilizing a neural network with input variables of dimension and mass increased the accuracy up to 97% and decreased the difference between the mean volumes to 0.7 cm3.
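The trapezoidal-element estimate described above amounts to summing solid-of-revolution slices along the longitudinal axis: each pair of adjacent edge radii bounds a thin truncated cone. A sketch follows, validated against a sphere of known volume rather than real apple edge profiles:

```python
# Hedged sketch: volume by revolving thin trapezoidal elements about the axis.
# radii[i] is the edge-profile radius at axial position i * dx; each adjacent
# pair forms a truncated cone with volume pi*dx*(r1^2 + r1*r2 + r2^2)/3.
import math

def volume_of_revolution(radii, dx):
    vol = 0.0
    for r1, r2 in zip(radii, radii[1:]):
        vol += math.pi * dx * (r1 * r1 + r1 * r2 + r2 * r2) / 3.0
    return vol

# Validation on a sphere: r(x) = sqrt(R^2 - x^2) should give ~ (4/3) pi R^3
R, n = 3.0, 2000
dx = 2 * R / n
radii = [math.sqrt(max(0.0, R * R - (-R + i * dx) ** 2)) for i in range(n + 1)]
vol = volume_of_revolution(radii, dx)
exact = 4.0 / 3.0 * math.pi * R ** 3
```

In the study the same summation would run over radii extracted from the apple's image edge, with the indentation volume subtracted afterwards.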
SOLTECH 92 proceedings: Solar Process Heat Program. Volume 1
Energy Technology Data Exchange (ETDEWEB)
1992-03-01
This document is a limited Proceedings, documenting the presentations given at the symposia conducted by the US Department of Energy's (DOE) Solar Industrial Program and Solar Thermal Electrical Program at SOLTECH92. The SOLTECH92 national solar energy conference was held in Albuquerque, New Mexico during the period February 17-20, 1992. The National Renewable Energy Laboratory manages the Solar Industrial Program; Sandia National Laboratories (Albuquerque) manages the Solar Thermal Electric Program. The symposia sessions were as follows: (1) Solar Industrial Program and Solar Thermal Electric Program Overviews, (2) Solar Process Heat Applications, (3) Solar Decontamination of Water and Soil, (4) Solar Building Technologies, (5) Solar Thermal Electric Systems, (6) PV Applications and Technologies. For each presentation given in these symposia, these Proceedings provide a one- to two-page abstract and copies of the viewgraphs and/or 35mm slides utilized by the speaker. Some speakers provided additional materials in the interest of completeness. The materials presented in this document were not subjected to a peer review process.
Newly Generated Liquid Waste Processing Alternatives Study, Volume 1
Energy Technology Data Exchange (ETDEWEB)
Landman, William Henry; Bates, Steven Odum; Bonnema, Bruce Edward; Palmer, Stanley Leland; Podgorney, Anna Kristine; Walsh, Stephanie
2002-09-01
This report identifies and evaluates three options for treating newly generated liquid waste at the Idaho Nuclear Technology and Engineering Center of the Idaho National Engineering and Environmental Laboratory. The three options are: (a) treat the waste using processing facilities designed for treating sodium-bearing waste, (b) treat the waste using subcontractor-supplied mobile systems, or (c) treat the waste using a special facility designed and constructed for that purpose. In studying these options, engineers concluded that the best approach is to store the newly generated liquid waste until a sodium-bearing waste treatment facility is available and then to co-process the stored inventory of the newly generated waste with the sodium-bearing waste. After the sodium-bearing waste facility completes its mission, two paths are available. The newly generated liquid waste could be treated using the subcontractor-supplied system or the sodium-bearing waste facility or a portion of it. The final decision depends on the design of the sodium-bearing waste treatment facility, which will be completed in coming years.
Dynamic Volume Holography and Optical Information Processing by Raman Scattering
International Nuclear Information System (INIS)
Dodin, I.Y.; Fisch, N.J.
2002-01-01
A method of producing holograms of three-dimensional optical pulses is proposed. It is shown that both the amplitude and the phase profile of a three-dimensional optical pulse can be stored in dynamic perturbations of a Raman medium, such as plasma. By employing Raman scattering in a nonlinear medium, information carried by a laser pulse can be captured in the form of a slowly propagating low-frequency wave that persists for a time large compared with the pulse duration. If such a hologram is then probed with a short laser pulse, the information stored in the medium can be retrieved in a second scattered electromagnetic wave. The recording and retrieving processes can robustly conserve the pulse shape, enabling high-fidelity recording and retrieval of information stored in optical signals. While storing or reading the pulse structure, the optical information can be processed as an analogue or digital signal, which allows simultaneous transformation of three-dimensional continuous images or computing on discrete arrays of binary data. By adjusting the phase fronts of the reference pulses, one can also perform focusing, redirecting, and other types of transformation of the output pulses.
Directory of Open Access Journals (Sweden)
Rauch Ł.
2015-09-01
Full Text Available Coupled finite element multiscale simulations (FE2) require costly numerical procedures in both macro and micro scales. Attempts to improve numerical efficiency focus mainly on two areas of development, i.e. parallelization/distribution of numerical procedures and simplification of the virtual material representation. One representative of both areas is the idea of the Statistically Similar Representative Volume Element (SSRVE). It aims at reducing the number of finite elements in the micro scale as well as parallelizing the micro-scale calculations, which can then be performed without barriers. The simplification of the computational domain is realized by transforming sophisticated images of the material microstructure into artificially created simple objects characterized by features similar to their original equivalents. In existing solutions for two-phase steels, the SSRVE is created on the basis of an analysis of shape coefficients of the hard phase in the real microstructure and a search for a representative simple structure with similar shape coefficients. Optimization techniques were used to solve this task. In the present paper, local strains and stresses are added to the cost function in the optimization. Various forms of the objective function composed of different elements were investigated and used in the optimization procedure for the creation of the final SSRVE. The results are compared with respect to the efficiency of the procedure and the uniqueness of the solution. The best objective function, composed of shape coefficients as well as strains and stresses, was proposed. Examples of SSRVEs determined for the investigated two-phase steel using that objective function are demonstrated in the paper. Each step of SSRVE creation is investigated from a computational efficiency point of view. The proposition of implementation of the whole computational procedure on modern High Performance Computing (HPC
Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao
2010-10-01
There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after a stabilization process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
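Step four above computes ARL(0), the in-control average run length: for a 3-sigma limit on normally distributed data, theory gives ARL(0) = 1/P(|Z| > 3) ≈ 370 points between false alarms. A small simulation (with illustrative parameters) can confirm this:

```python
# Hedged sketch: estimating ARL(0) of a 3-sigma chart by simulation.
# Each run counts how many in-control N(0,1) points occur before one
# falls outside +/- 3 sigma; the mean run length approximates ARL(0).
import random
import statistics

def simulated_arl0(n_runs=2000, limit=3.0, seed=7):
    rng = random.Random(seed)
    run_lengths = []
    for _ in range(n_runs):
        t = 0
        while True:
            t += 1
            if abs(rng.gauss(0.0, 1.0)) > limit:
                break
        run_lengths.append(t)
    return statistics.mean(run_lengths)

arl0 = simulated_arl0()  # should be in the vicinity of 370
```

Comparing surgeon groups only makes sense once each group's chart has a comparable, stable ARL(0), which is why the study stabilizes the process first.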
The product composition control system at Savannah River: Statistical process control algorithm
International Nuclear Information System (INIS)
Brown, K.G.
1994-01-01
The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means which guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs
Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr
2012-05-01
In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, (15)O-H(2)O and (18)F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
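The convolution relation described above (tissue time-course = arterial input convolved with a residue, or survival, function) can be sketched directly. The input and residue curves below are illustrative stand-ins, not real tracer data:

```python
# Hedged sketch: tissue time-course as a discrete convolution of the
# arterial input C_a with a residue (survival) function R:
#   C_t[k] = dt * sum_{j=0..k} C_a[j] * R[k - j]
import math

def tissue_curve(arterial, residue, dt):
    n = len(arterial)
    out = []
    for k in range(n):
        s = 0.0
        for j in range(k + 1):
            s += arterial[j] * residue[k - j]
        out.append(s * dt)
    return out

dt = 0.5  # minutes between frames (illustrative)
t = [i * dt for i in range(20)]
arterial = [math.exp(-(x - 2.0) ** 2) for x in t]  # bolus-like input peaking at 2 min
residue = [math.exp(-0.3 * x) for x in t]          # monoexponential residue
tissue = tissue_curve(arterial, residue, dt)
```

Kinetic analysis inverts this relation: given measured tissue and arterial curves, it estimates the residue (here assumed monoexponential; the review's nonparametric approach uses a piecewise linear form instead).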
Super Efficient Refrigerator Program (SERP) evaluation. Volume 1: Process evaluation
Energy Technology Data Exchange (ETDEWEB)
Sandahl, L.J.; Ledbetter, M.R.; Chin, R.I.; Lewis, K.S.; Norling, J.M.
1996-01-01
The Pacific Northwest National Laboratory (PNNL) conducted this study for the US Department of Energy (DOE) as part of the Super Efficient Refrigerator Program (SERP) Evaluation. This report documents the SERP formation and implementation process, and identifies preliminary program administration and implementation issues. The findings are based primarily on interviews with those familiar with the program, such as utilities, appliance manufacturers, and SERP administrators. These interviews occurred primarily between March and April 1995, when SERP was in the early stages of program implementation. A forthcoming report will estimate the preliminary impacts of SERP within the industry and marketplace. Both studies were funded by DOE at the request of SERP Inc., which sought a third-party evaluation of its program.
Sampling, Probability Models and Statistical Reasoning Statistical Inference
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals
Directory of Open Access Journals (Sweden)
Syafiq Hazwan
2016-01-01
Full Text Available Cooling systems are a key element of the hot forming process for Ultra High Strength Steels (UHSS). Normally, cooling channels are made using a deep drilling technique. Although deep twist drilling offers higher productivity than other drilling techniques, its main problem is premature tool breakage, which affects production quality. In this paper, an analysis of deep twist drill process parameters such as cutting speed, feed rate and depth of cut, using statistical analysis to identify the tool condition, is presented. Comparisons between two different tool geometries are also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation and skewness. Results show that kurtosis and skewness are the most appropriate parameters to represent deep twist drill tool condition from the vibration and force data. The condition of the deep twist drill process was classified as good, blunt or fractured. It was also found that the tool geometry parameters affect the performance of the drill. We believe the results of this study are useful in determining a suitable analysis method for developing an online tool condition monitoring system that identifies the tertiary tool life stage and helps avoid tool fracture during the drilling process.
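The signal features named in the abstract (RMS, mean, standard deviation, skewness, kurtosis) can be computed directly from a sensor time series. A minimal sketch, with an invented force trace whose spikes mimic a deteriorating tool:

```python
# Illustrative feature extraction for tool-condition monitoring.
# The input signal is made up; real use would take vibration/force samples.
import math

def signal_features(x):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    std = math.sqrt(var)
    rms = math.sqrt(sum(v * v for v in x) / n)
    skew = sum((v - mean) ** 3 for v in x) / (n * std ** 3)
    kurt = sum((v - mean) ** 4 for v in x) / (n * std ** 4)  # Pearson form (normal = 3)
    return {"mean": mean, "std": std, "rms": rms, "skew": skew, "kurt": kurt}

force = [0.1, -0.2, 0.15, 3.0, -0.1, 0.05, -2.8, 0.2]  # spiky: possible fracture
feats = signal_features(force)
```

A heavy-tailed, spiky signal like the one above yields kurtosis well above 3, which is the kind of signature the paper associates with tool damage.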
Oravec, Heather Ann; Daniels, Christopher C.
2014-01-01
The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred, which may have influenced the mechanical performance of the seal. This knowledge improves the chances that this and future space seals will satisfy or exceed design specifications.
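One common way to isolate unusual measurements from typical performance, as described above, is an individuals (I) control chart with 3-sigma limits estimated from the average moving range. This is a generic sketch with invented compression-load values, not the seal test data:

```python
# Minimal individuals chart: flag points outside 3-sigma limits where
# sigma is estimated from the average moving range (d2 = 1.128 for n = 2).
def individuals_chart(x):
    n = len(x)
    center = sum(x) / n
    mrbar = sum(abs(x[i] - x[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mrbar / 1.128
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    flags = [i for i, v in enumerate(x) if v > ucl or v < lcl]
    return center, lcl, ucl, flags

loads = [101, 99, 100, 102, 98, 100, 101, 99, 100, 115]  # last point unusual
center, lcl, ucl, out_of_control = individuals_chart(loads)
```

Points flagged by such a chart suggest a special cause (for example, a change in the molding process) rather than common-cause scatter.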
An integrated model of statistical process control and maintenance based on the delayed monitoring
International Nuclear Information System (INIS)
Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei
2015-01-01
This paper develops an integrated model of statistical process control and maintenance decision-making. The proposed delayed monitoring policy postpones the sampling process until a scheduled time and yields ten scenarios of the production process, in which equipment failure may occur in addition to a quality shift. Equipment failure and a control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, the cycle time and the cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost by using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to verify the analysis of the ten-scenario model. Another ordinary integrated model without delayed monitoring is also established for comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy will help reduce the expected cost
Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios
2005-01-01
Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...
On the structure of dynamic principal component analysis used in statistical process monitoring
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat; Bergquist, Bjarne
2017-01-01
When principal component analysis (PCA) is used for statistical process monitoring it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time...... for determining the number of principal components to retain. The number of retained principal components is determined by visual inspection of the serial correlation in the squared prediction error statistic, Q (SPE), together with the cumulative explained variance of the model. The methods are illustrated using...... driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method...
ZnO crystals obtained by electrodeposition: Statistical analysis of most important process variables
International Nuclear Information System (INIS)
Cembrero, Jesus; Busquets-Mataix, David
2009-01-01
In this paper a comparative study, by means of a statistical analysis, of the main process variables affecting ZnO crystal electrodeposition is presented. ZnO crystals were deposited on two different substrates, silicon wafer and indium tin oxide. The control variables were substrate type, electrolyte concentration, temperature, exposure time and current density. The morphologies of the different substrates were observed using scanning electron microscopy. The percentage of substrate area covered by the ZnO deposit was calculated by computational image analysis. The design of the applied experiments was based on a two-level factorial analysis involving a series of 32 experiments and an analysis of variance. Statistical results reveal that the variables exerting a significant influence on the area covered by the ZnO deposit are electrolyte concentration, substrate type and deposition time, together with a combined two-factor interaction between temperature and current density. However, morphology is also influenced by the surface roughness of the substrates
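A two-level factorial analysis of the kind used above estimates each main effect as the difference between mean responses at the high and low factor levels. This is a hedged sketch with a tiny 2² design and invented coverage values, not the paper's 32-run data:

```python
# Main-effect estimate in a two-level factorial design (levels coded -1/+1).
def main_effect(levels, response):
    hi = [r for l, r in zip(levels, response) if l == +1]
    lo = [r for l, r in zip(levels, response) if l == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# 2^2 design in electrolyte concentration (C) and deposition time (T)
C = [-1, +1, -1, +1]
T = [-1, -1, +1, +1]
area = [20.0, 35.0, 30.0, 55.0]   # % substrate covered (illustrative)

effect_C = main_effect(C, area)
effect_T = main_effect(T, area)
```

An analysis of variance would then compare each effect against the experimental error to decide, as in the abstract, which variables are statistically significant.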
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
be used to detect outliers in the data since the outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in case of the Q-statistic......Traditional/Asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based....... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. Bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed where a set of sensible choices (plus...
Kottner, Jan; Halfens, Ruud
2010-05-01
Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcers prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system) were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
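The P charts mentioned above plot per-period proportions against binomial 3-sigma limits around the pooled mean, so that variation by chance alone is distinguished from real change. A minimal sketch with invented counts, not the Dutch survey data:

```python
# P chart: proportion per survey year with 3-sigma binomial control limits.
import math

def p_chart(cases, at_risk):
    pbar = sum(cases) / sum(at_risk)
    points = []
    for x, n in zip(cases, at_risk):
        se = math.sqrt(pbar * (1 - pbar) / n)
        ucl, lcl = pbar + 3 * se, max(0.0, pbar - 3 * se)
        p = x / n
        points.append((p, lcl, ucl, lcl <= p <= ucl))
    return pbar, points

cases = [22, 18, 20, 25, 19]        # nosocomial ulcers per year (invented)
at_risk = [200, 190, 210, 205, 198] # patients at risk per year (invented)
pbar, points = p_chart(cases, at_risk)
```

If every yearly proportion falls within its limits, the SPC reading is that the prevalence varies by chance only, even when a chi-square trend test on the same data reports a significant trend.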
New advances in statistical modeling and applications
Santos, Rui; Oliveira, Maria; Paulino, Carlos
2014-01-01
This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.
International Nuclear Information System (INIS)
Suzuki, Mitsutoshi; Hori, Masato; Asou, Ryoji; Usuda, Shigekazu
2006-01-01
The multiscale statistical process control (MSSPC) method is applied to clarify the elements of material unaccounted for (MUF) in large scale reprocessing plants using numerical calculations. Continuous wavelet functions are used to decompose the process data, which simulate batch operation superimposed by various types of disturbance, and the disturbance components included in the data are divided into time and frequency spaces. The MSSPC diagnosis is applied to distinguish abnormal events in the process data and shows how to detect abrupt and protracted diversions using principal component analysis. The quantitative performance of MSSPC for the time series data is shown with average run lengths given by Monte Carlo simulation, for comparison with the non-detection probability β. Recent discussion about bias corrections in material balances is introduced and another approach is presented to evaluate MUF without assuming the measurement error model. (author)
Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya
2014-09-01
To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
International Nuclear Information System (INIS)
Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya
2014-01-01
Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
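Process capability indices of the kind used in this QA study compare the spread of measurements to the specification limits: Cp uses spread alone, Cpk also penalizes off-center processes. A sketch with invented range-difference data against a ±0.5 mm tolerance mirroring the daily-QA level quoted above:

```python
# Cp and Cpk capability indices for measured-vs-calculated range differences.
import statistics

def capability(data, lsl, usl):
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

range_diff_mm = [0.05, -0.10, 0.12, 0.00, -0.08, 0.07, -0.03, 0.02]  # invented
cp, cpk = capability(range_diff_mm, lsl=-0.5, usl=0.5)
```

A common rule of thumb treats Cpk above about 1.33 as a capable process; the customized tolerance in the paper is set so that routine QA measurements meet such a criterion.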
Mendoza-Rosas, Ana Teresa; Gómez-Vázquez, Ángel; De la Cruz-Reyna, Servando
2017-06-01
Popocatépetl volcano reawakened in 1994 after nearly 70 years of quiescence. Between 1996 and 2015, a succession of at least 38 lava domes have been irregularly emplaced and destroyed, with each dome reaching particular volumes at specific emplacement rates. The complexity of this sequence is analyzed using statistical methods in an attempt to gain insight into the physics and dynamics of the lava dome emplacement and destruction process and to objectively assess the hazards related to that volcano. The time series of emplacements, dome residences, lava effusion lulls, and emplaced dome volumes and thicknesses are modeled using the simple exponential and Weibull distributions, the compound non-homogeneous generalized Pareto-Poisson process (NHPPP), and the mixture of exponentials distribution (MOED). The statistical analysis reveals that the sequence of dome emplacements is a non-stationary, self-regulating process most likely controlled by the balance between buoyancy-driven magma ascent and volatile exsolution crystallization. This balance has supported the sustained effusive activity for decades and may persist for an undetermined amount of time. However, the eruptive history of Popocatépetl includes major Plinian phases that may have resulted from a breach in that balance. Certain criteria to recognize such breaching conditions are inferred from this statistical analysis.
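For the simple exponential model named above, the maximum-likelihood rate for inter-event durations is just the reciprocal of the sample mean, and the survival function follows directly. The lull durations below are invented, not the Popocatépetl catalogue:

```python
# Exponential MLE for inter-event (effusion lull) durations.
import math

def exponential_mle(durations):
    mean = sum(durations) / len(durations)
    rate = 1.0 / mean                       # events per unit time
    survival = lambda t: math.exp(-rate * t)  # P(duration > t)
    return rate, survival

lulls_days = [30.0, 90.0, 45.0, 120.0, 60.0, 75.0]  # illustrative
rate, survival = exponential_mle(lulls_days)
p_longer_than_100 = survival(100.0)   # P(lull exceeds 100 days)
```

The Weibull, Pareto-Poisson and mixture-of-exponentials models used in the paper generalize this baseline to capture the non-stationary, self-regulating behavior the analysis reveals.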
Assessment of the beryllium lymphocyte proliferation test using statistical process control.
Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M
2006-10-01
Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that
Multivariate statistical process control in product quality review assessment - A case study.
Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A
2017-11-01
According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against the quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining only the effect of a single factor at a time using a Shewhart chart. It neglects to take into account the interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, where 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA), for the data being either autoscaled or robust scaled, showed four and seven batches, respectively, outside the Hotelling T2 95% ellipse. Also, an improvement in the capability of the process is observed without the most extreme batches. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
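The Hotelling T2 statistic behind the 95% ellipse mentioned above measures each batch's distance from the mean in units of the joint covariance, so a batch can be jointly unusual even when each assay is individually in range. A hedged two-variable sketch with invented assay values, not the Moroccan APR data:

```python
# Hotelling T2 for two correlated quality variables, with the 2x2
# covariance matrix inverted by hand.
def hotelling_t2(data):
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    det = sxx * syy - sxy * sxy
    t2 = []
    for x, y in data:
        dx, dy = x - mx, y - my
        t2.append((dx * dx * syy - 2 * dx * dy * sxy + dy * dy * sxx) / det)
    return t2

# % label claim for two active ingredients, per batch (invented)
batches = [(99.8, 100.1), (100.2, 99.9), (99.9, 100.0),
           (100.1, 100.2), (100.0, 99.8), (103.0, 96.5)]  # last: joint shift
t2 = hotelling_t2(batches)
suspect = [i for i, v in enumerate(t2) if v == max(t2)]
```

Batches whose T2 exceeds the chosen confidence limit fall outside the ellipse; in the full study the statistic is computed on PCA scores of six assays rather than two raw variables.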
Directory of Open Access Journals (Sweden)
Fabiano Rodrigues Soriano
Full Text Available Abstract Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring and analyzing causes of variation in quality characteristics and/or in the parameters used to control and improve processes. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as lack of top management commitment and little understanding of its potential benefits, and to technical factors, such as lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between the conditioning factors associated with top management commitment (Support), SPC Training and Application, as well as the relationships between these factors and the benefits associated with implementing the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish the causal relations. A cross-sectional survey was used to collect information from a sample of Brazilian auto-parts companies, selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey; 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs, which in turn affects the way companies apply the techniques, and this is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected and are represented by the relation between top management support and training. The technical aspects observed through SPC
Comparison of Statistical Post-Processing Methods for Probabilistic Wind Speed Forecasting
Han, Keunhee; Choi, JunTae; Kim, Chansoo
2018-02-01
In this study, statistical post-processing methods that produce bias-corrected and probabilistic forecasts of wind speed measured in PyeongChang, which is scheduled to host the 2018 Winter Olympics, are compared and analyzed to provide more accurate weather information. The six post-processing methods used in this study are as follows: mean bias-corrected forecast, mean and variance bias-corrected forecast, decaying averaging forecast, mean absolute bias-corrected forecast, and the alternative implementations of the ensemble model output statistics (EMOS) and Bayesian model averaging (BMA) models, namely EMOS and BMA exchangeable models that assume exchangeable ensemble members, and simplified versions of the EMOS and BMA models. Observations of wind speed were obtained from 26 stations in PyeongChang, and 51 ensemble member forecasts derived from the European Centre for Medium-Range Weather Forecasts (ECMWF Directorate, 2012) were obtained between 1 May 2013 and 18 March 2016. Prior to applying the post-processing methods, a reliability analysis was conducted by using rank histograms to identify the statistical consistency of the ensemble forecasts and corresponding observations. Based on the results of our study, we found that the prediction skills of the probabilistic forecasts of the EMOS and BMA models were superior to the bias-corrected forecasts in terms of deterministic prediction, whereas in probabilistic prediction, BMA models showed better prediction skill than EMOS. Even though the simplified version of the BMA model exhibited the best prediction skill among the six methods, the results showed that the differences in prediction skill between the versions of EMOS and BMA were negligible.
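The decaying averaging forecast named above can be sketched as an exponentially weighted running bias estimate that is subtracted from each new forecast. The update rule, weight, and wind-speed values below are illustrative assumptions, not the study's configuration:

```python
# Decaying-average bias correction: the bias estimate is updated as
# b_new = (1 - w) * b_old + w * (forecast - observation),
# and each forecast is corrected with the bias known so far.
def decaying_average_correct(forecasts, observations, weight=0.1):
    bias = 0.0
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)        # correct before seeing today's obs
        bias = (1 - weight) * bias + weight * (f - o)
    return corrected

fcst = [6.0, 5.5, 7.0, 6.5, 6.0]   # wind speed forecasts (m/s), invented
obs  = [5.0, 4.8, 6.1, 5.6, 5.2]   # matching observations, invented
corrected = decaying_average_correct(fcst, obs)
```

Unlike EMOS or BMA, this method only shifts the deterministic forecast; it does not produce a predictive distribution.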
Feasibility of large volume casting cementation process for intermediate level radioactive waste
International Nuclear Information System (INIS)
Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze
1988-01-01
The recent tendency of radioactive waste treatment and disposal, both in China and abroad, is reviewed. The feasibility of the large volume casting cementation process for treating the intermediate level radioactive waste from a spent fuel reprocessing plant and disposing of it in shallow land is assessed on the basis of analyses of the experimental results (such as the formulation study, measurements of the solidified radioactive waste properties, etc.). It can be concluded that the large volume casting cementation process is a promising, safe and economic process. It is feasible to dispose of the intermediate level radioactive waste from the reprocessing plant if the disposal site chosen has reasonable geological and geographical conditions and some additional effective protection measures are taken
Rodrigues, Clóves G.; Silva, Antônio A. P.; Silva, Carlos A. B.; Vasconcellos, Áurea R.; Ramos, J. Galvão; Luzzi, Roberto
2010-01-01
The notable present-day development of all modern technology, fundamental for the progress and well-being of world society, imposes a great deal of stress on the realm of basic Physics, more precisely on Thermo-Statistics. We face situations in electronics and optoelectronics involving physical-chemical systems far removed from equilibrium, where ultrafast (on the pico- and femtosecond scale) and non-linear processes are present. Further, we need to be aware of the rapid unfolding of nano-te...
Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund
2016-02-18
In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to the differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
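The core re-sampling idea described above can be sketched simply: draw bootstrap samples from historical Normal Operating Condition (NOC) contribution values and take an empirical percentile as the upper confidence limit against which new runs are judged. Data, sample size, and percentile below are invented for illustration:

```python
# Bootstrap percentile limit for one variable's contribution, estimated
# from historical NOC data (this is a simplification of the paper's
# full PCA-based contribution-plot procedure).
import random

def bootstrap_upper_limit(values, n_boot=2000, percentile=0.95, seed=1):
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        sample = [rng.choice(values) for _ in values]
        means.append(sum(sample) / len(sample))
    means.sort()
    return means[int(percentile * n_boot) - 1]

# Historical NOC contributions of one process variable (illustrative)
noc = [0.12, 0.08, 0.15, 0.10, 0.09, 0.11, 0.14, 0.07, 0.13, 0.10]
ucl = bootstrap_upper_limit(noc)
new_run_contribution = 0.35
flagged = new_run_contribution > ucl
```

Because no distributional assumption is needed, this works even where asymptotic theory for contribution plots is unavailable, which is the paper's motivation for re-sampling.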
Statistical Analysis of the First Passage Path Ensemble of Jump Processes
von Kleist, Max; Schütte, Christof; Zhang, Wei
2018-02-01
The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
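As the abstract notes, first-passage statistics reduce to solving linear systems. A minimal sketch for the discrete-time case: expected first passage times into a target set satisfy h(i) = 1 + Σ_j P(i,j) h(j) off the target and h = 0 on it, and can be obtained by fixed-point iteration. The three-state chain is invented:

```python
# Expected first passage times into a target set of a discrete-time
# Markov jump process, via iteration of the linear hitting-time equations.
def expected_hitting_times(P, target, iters=500):
    """h[i] = 0 for i in target, else h[i] = 1 + sum_j P[i][j] * h[j]."""
    n = len(P)
    h = [0.0] * n
    for _ in range(iters):
        h = [0.0 if i in target else
             1.0 + sum(P[i][j] * h[j] for j in range(n))
             for i in range(n)]
    return h

# 3-state chain; state 2 is the (absorbing) target subset
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.0, 1.0]]
h = expected_hitting_times(P, target={2})
```

A direct linear solve is equivalent and is what scales with "the complexity of solving a linear system" as the paper says; the iteration above is just the simplest self-contained route for a small chain.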
Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G
In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of
Statistical methods to assess and control processes and products during nuclear fuel fabrication
International Nuclear Information System (INIS)
Weidinger, H.
1999-01-01
Very good statistical tools and techniques are available today to assess the quality and reliability of the fabrication process, the original source of a good and reliable quality in the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of a fabrication process, assuming it is stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance oriented. However, performance orientation is only successful through an open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the west, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase an open attitude of the vendor toward the using utility. The responsibility of the utilities increased considerably, to ensure an adequate quality of the fuel they received. As a matter of fact, sometimes the utilities had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the west; however, it now also reaches the utilities in the eastern countries. (author)
A bibliometric analysis of 50 years of worldwide research on statistical process control
Directory of Open Access Journals (Sweden)
Fabiane Letícia Lizarelli
An increasing number of papers on statistical process control (SPC) have emerged in the last fifty years, especially in the last fifteen. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC plays a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. Several other statistical techniques and tools assist the continuous improvement and innovation of products and processes but, despite the limitations on their use in improvement projects, there is growing concern about the use of SPC. Empirical research observes a gap between the SPC techniques taught in engineering courses and their practical application to industrial problems; it is therefore important to understand what has been done and to identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers, and their networks. The main sources of publications are also identified; SPC papers are concentrated in a few of the international research journals, not necessarily those with the highest impact factors, and are focused on the industrial engineering, operations research and management science fields. The most common term found in the papers was cumulative sum control charts, but new topics have emerged and been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods.
Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom
2015-01-01
It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, and this reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques known for their ability to detect small shifts in the data, tabular CUSUM, standardized CUSUM and EWMA, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
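The tabular CUSUM and EWMA schemes mentioned above can be sketched in a few lines. This is a minimal stdlib illustration, not the paper's implementation: the in-control mean, standard deviation, shift size, and tuning constants (allowance K, decision interval H, EWMA weight and limit width) are all assumed for the example.

```python
# Hedged sketch: tabular CUSUM and EWMA monitoring of simulated transfer
# times. All parameters below are assumptions for the illustration.
import random

random.seed(42)

MU0, SIGMA = 20.0, 2.0           # assumed in-control transfer time mean/sd (s)
K = 0.5 * SIGMA                  # CUSUM allowance
H = 5.0 * SIGMA                  # CUSUM decision interval
LAM, L = 0.2, 3.0                # EWMA weight and control-limit width

# 30 in-control days followed by 30 days with a sustained +2-sigma slowdown.
data = [random.gauss(MU0, SIGMA) for _ in range(30)]
data += [random.gauss(MU0 + 2 * SIGMA, SIGMA) for _ in range(30)]

def cusum_alarm(xs):
    """First index where the upper tabular CUSUM exceeds H, else None."""
    c = 0.0
    for i, x in enumerate(xs):
        c = max(0.0, c + (x - MU0) - K)
        if c > H:
            return i
    return None

def ewma_alarm(xs):
    """First index where the EWMA leaves its steady-state limits, else None."""
    z = MU0
    width = L * SIGMA * (LAM / (2 - LAM)) ** 0.5
    for i, x in enumerate(xs):
        z = LAM * x + (1 - LAM) * z
        if abs(z - MU0) > width:
            return i
    return None

print("CUSUM alarm at day:", cusum_alarm(data))
print("EWMA alarm at day:", ewma_alarm(data))
```

Both schemes accumulate evidence over time, which is why they outperform a plain Shewhart chart on the small, gradual shifts typical of slowly declining mobility.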
Directory of Open Access Journals (Sweden)
Savic Marija V.
2015-09-01
This article presents the results of the statistical modeling of copper losses in the silicate slag of the sulfide concentrate smelting process. The aim of this study was to define the correlation dependence of the degree of copper losses in the silicate slag on the following parameters of the technological process: SiO2, FeO, Fe3O4, CaO and Al2O3 content in the slag and copper content in the matte. Multiple linear regression analysis (MLRA), artificial neural networks (ANNs) and an adaptive network-based fuzzy inference system (ANFIS) were used as tools for mathematical analysis of the indicated problem. The best correlation coefficient (R2 = 0.719) of the final model was obtained using the ANFIS modeling approach.
Energy Technology Data Exchange (ETDEWEB)
Fehrmann, Henning [Westinghouse Electric Germany GmbH, Dudenstr. 44, D-68167 Mannheim (Germany); Aign, Joerg [Westinghouse Electric Germany GmbH, Global D and D and Waste Management, Tarpenring 6, D-22419 Hamburg (Germany)
2013-07-01
In nuclear power plants (NPP), ion exchange (IX) resins are used in several systems for water treatment. Spent resins can contain a significant amount of contaminants, which makes treatment of spent resins for disposal mandatory. Several treatment processes are available, such as direct immobilization with technologies like cementation, bituminisation, polymer solidification, or use of a high integrity container (HIC). These technologies usually come with a significant increase in final waste volume. Hot Resin Supercompaction (HRSC) is a thermal treatment process that reduces the resin waste volume significantly. For a mixture of powdered and bead resins, the HRSC process has demonstrated a volume reduction of up to 75% [1]. For bead resins alone, the HRSC process is challenging because the compaction properties of bead resins are unfavorable: the bead resin material does not form a solid block after compaction and shows a high spring-back effect, so the volume reduction is not as good as for the mixture described in [1]. The compaction properties of bead resin waste can be significantly improved by grinding the beads to powder; grinding also eliminates the need for a powder additive. Westinghouse has developed a modular grinding process to grind the bead resin to powder. The developed process requires no circulation of resins and enables a selective adjustment of particle size and distribution to achieve optimal results in the HRSC or any other downstream process. A special grinding tool setup is used to minimize maintenance and radiation exposure of personnel. (authors)
A generic statistical methodology to predict the maximum pit depth of a localized corrosion process
International Nuclear Information System (INIS)
Jarrah, A.; Bigerelle, M.; Guillemot, G.; Najjar, D.; Iost, A.; Nianga, J.-M.
2011-01-01
Highlights: → We propose a methodology to predict the maximum pit depth in a corrosion process. → The Generalized Lambda Distribution and the Computer Based Bootstrap Method are combined. → The GLD fits a large variety of distributions in both their central and tail regions. → The minimum thickness preventing perforation can be estimated with a safety margin. → Given its applications, this new approach can help to size industrial pieces. - Abstract: This paper outlines a new methodology to accurately predict the maximum pit depth related to a localized corrosion process. It combines two statistical methods: the Generalized Lambda Distribution (GLD), to determine a model of distribution fitting the experimental frequency distribution of depths, and the Computer Based Bootstrap Method (CBBM), to generate simulated distributions equivalent to the experimental one. In comparison with conventionally established statistical methods, which are restricted to the use of inferred distributions constrained by specific mathematical assumptions, the major advantage of the presented methodology is that both the GLD and the CBBM enable a statistical treatment of the experimental data without any preconceived choice of either the unknown theoretical parent distribution underlying the pit depths, which characterizes the global corrosion phenomenon, or the unknown associated theoretical extreme value distribution, which characterizes the deepest pits. Considering an experimental distribution of depths of pits produced on an aluminium sample, estimations of maximum pit depth using a GLD model are compared to similar estimations based on the usual Gumbel and Generalized Extreme Value (GEV) methods proposed in the corrosion engineering literature. The GLD approach is shown to have smaller bias and dispersion in the estimation of the maximum pit depth than the Gumbel approach, both for its realization and mean. This leads to comparing the GLD approach to the GEV one
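The resampling step behind the CBBM can be illustrated in a few lines: resample the observed pit depths with replacement many times and study the distribution of the resampled maxima. The sketch below uses synthetic depth values and omits the GLD fitting stage that the paper performs first; the exponential depth distribution and the 95th-percentile design value are assumptions for the example.

```python
# Hedged sketch of bootstrap resampling of pit-depth maxima.
# The depth data are synthetic; the paper uses experimental measurements
# and fits a Generalized Lambda Distribution before resampling.
import random

random.seed(0)

# Hypothetical pit depths (micrometres) measured on one sample surface.
depths = [random.expovariate(1 / 40.0) for _ in range(200)]

def bootstrap_maxima(sample, n_boot=2000):
    """Bootstrap distribution of the sample maximum."""
    n = len(sample)
    return [max(random.choices(sample, k=n)) for _ in range(n_boot)]

boot = sorted(bootstrap_maxima(depths))
# Upper percentile of the bootstrap maxima as a conservative design value.
p95 = boot[int(0.95 * len(boot))]
print(f"observed max: {max(depths):.1f} um, "
      f"95th percentile of bootstrap maxima: {p95:.1f} um")
```

Note that the plain bootstrap maximum can never exceed the observed maximum; fitting a parametric tail model such as the GLD first, as the paper does, is what allows extrapolation beyond the deepest observed pit.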
International Nuclear Information System (INIS)
Sen, Biswajit; Mandal, Swapan
2007-01-01
An initially prepared coherent state coupled to a second-order nonlinear medium gives rise to stimulated and spontaneous hyper-Raman processes. Using an intuitive approach based on perturbation theory, the Hamiltonian corresponding to the hyper-Raman processes is solved analytically to obtain the temporal development of the field operators. These analytical solutions are valid for small coupling constants; interestingly, however, they remain valid for reasonably large times. Hence, the present analytical solutions are quite general and novel compared with solutions obtained under short-time approximations. By exploiting the analytical solutions of the field operators for the various modes, we investigate the squeezing, photon antibunching and nonclassical photon statistics of the pure modes of the input coherent light responsible for the hyper-Raman processes. In at least one instance (the stimulated hyper-Raman process for the vibration phonon mode), we report the simultaneous appearance of classical (photon bunching) and nonclassical (squeezing) effects of the radiation field responsible for the hyper-Raman processes
IMPROVING KNITTED FABRICS BY A STATISTICAL CONTROL OF DIMENSIONAL CHANGES AFTER THE DYEING PROCESS
Directory of Open Access Journals (Sweden)
LLINARES-BERENGUER Jorge
2017-05-01
One of the most important problems that cotton knitted fabrics present during the manufacturing process is their dimensional instability, which needs to be minimised. Some of the variables that intervene in fabric shrinkage are related to its structural characteristics, the fiber used to produce the yarn, the yarn count, or the dyeing process employed. Conducted under real factory conditions, the present study attempted to model the behaviour of a fabric structure after a dyeing process by contributing several algorithms that calculate dyed fabric stability after the first wash cycle. Small-diameter circular machines are used to produce garments with no side seams; this is why a list of machines that produce the same fabrics at different widths needs to be made available to produce all the sizes of a given garment. Two relaxation states were distinguished for the interlock fabric: dyed and dry relaxation, and dyed and wash relaxation. The linear density of the yarn employed to produce the sample fabric was combed cotton Ne 30. Overflow machines were used for optical bleaching. To obtain knitting structures with optimum dimensional stability, different statistical tools were used to help us evaluate all the production process variables (raw material, machines and process) responsible for this variation. This allowed us to guarantee product quality without creating costs and losses.
Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello
2016-01-01
The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, an out-of-control situation occurred at the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
Singh, Sarabjeet; Schneider, David J; Myers, Christopher R
2014-03-01
Branching processes have served as a model for chemical reactions, biological growth processes, and contagion (of disease, information, or fads). Through this connection, these seemingly different physical processes share some common universalities that can be elucidated by analyzing the underlying branching process. In this work we focus on coupled branching processes as a model of infectious diseases spreading from one population to another. An exceedingly important example of such coupled outbreaks are zoonotic infections that spill over from animal populations to humans. We derive several statistical quantities characterizing the first spillover event from animals to humans, including the probability of spillover, the first passage time distribution for human infection, and disease prevalence in the animal population at spillover. Large stochastic fluctuations in those quantities can make inference of the state of the system at the time of spillover difficult. Focusing on outbreaks in the human population, we then characterize the critical threshold for a large outbreak, the distribution of outbreak sizes, and associated scaling laws. These all show a strong dependence on the basic reproduction number in the animal population and indicate the existence of a novel multicritical point with altered scaling behavior. The coupling of animal and human infection dynamics has crucial implications, most importantly allowing for the possibility of large human outbreaks even when human-to-human transmission is subcritical.
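The spillover probability described above can be approximated with a toy Galton-Watson simulation. This is a hedged illustration, not the authors' model: the animal outbreak is a branching process with Poisson offspring, each animal case independently infects a human with a small fixed probability, and both parameter values are assumptions for the sketch.

```python
# Hedged sketch: probability that a subcritical animal outbreak spills
# over into the human population. Parameters are assumed, not the paper's.
import math
import random

random.seed(7)

R0_ANIMAL = 0.8   # subcritical reproduction number in the animal host
P_SPILL = 0.05    # per-case probability of infecting a human

def poisson(lam):
    """Knuth's Poisson sampler (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def outbreak_spills_over(max_generations=200):
    """Simulate one animal outbreak; True if any case infects a human."""
    cases = 1
    for _ in range(max_generations):
        if cases == 0:
            return False
        if any(random.random() < P_SPILL for _ in range(cases)):
            return True
        cases = sum(poisson(R0_ANIMAL) for _ in range(cases))
    return False

trials = 5000
p_hat = sum(outbreak_spills_over() for _ in range(trials)) / trials
print(f"estimated spillover probability: {p_hat:.3f}")
```

Because the animal process is subcritical (R0 < 1), outbreaks die out quickly; the spillover probability is driven by the total outbreak size, whose mean is 1/(1 - R0), which is the dependence on the animal reproduction number that the abstract emphasizes.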
The Initial Regression Statistical Characteristics of Intervals Between Zeros of Random Processes
Directory of Open Access Journals (Sweden)
V. K. Hohlov
2014-01-01
The article substantiates the initial regression statistical characteristics of the intervals between zeros of realizations of random processes and studies the properties that allow the use of these characteristics in autonomous information systems (AIS) of near location (NL). Coefficients of the initial regression (CIR) that minimize the residual sum of squares of multiple initial regression views are justified on the basis of vector representations associated with a random-vector notion of the analyzed signal parameters. It is shown that, even with no covariance-based private CIR, it is possible to predict one random variable through another with respect to the deterministic components. The paper studies the dependence of the CIR of interval sizes between zeros of a narrowband wide-sense stationary random process on its energy spectrum. Particular CIRs for random processes with Gaussian and rectangular energy spectra are obtained. It is shown that the considered CIRs do not depend on the average frequency of the spectra, are determined by the relative bandwidth of the energy spectra, and depend weakly on the type of spectrum. These CIR properties enable their use as an informative parameter when implementing temporal regression methods of signal processing that are invariant to the average rate and variance of the input realizations. We consider estimates of the average energy spectrum frequency of a stationary random process obtained by calculating the length of the time interval corresponding to a specified number of intervals between zeros. It is shown that the relative variance of the estimate of the average energy spectrum frequency of a stationary random process with increasing relative bandwidth ceases to depend on the process realization when more than ten intervals between zeros are processed. The obtained results can be used in the AIS NL to solve the tasks of detection and signal recognition, when a decision is made in conditions of unknown mathematical expectations on a limited observation
Frontiers in statistical quality control
Wilrich, Peter-Theodor
2004-01-01
This volume treats the four main categories of Statistical Quality Control: General SQC Methodology; On-line Control, including Sampling Inspection and Statistical Process Control; Off-line Control, with Data Analysis and Experimental Design; and fields related to Reliability. Experts with international reputations present their newest contributions.
QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY
Directory of Open Access Journals (Sweden)
Yonatan Mengesha Awaj
2013-03-01
In order to survive in a competitive market, improving the quality and productivity of products and processes is a must for any company. This study applies statistical process control (SPC) tools in the production processing line and on the final product in order to reduce defects by identifying where the highest waste occurs and to give suggestions for improvement. The approach used in this study comprises direct observation, thorough examination of the production process lines, brainstorming sessions, and fishbone diagrams; information was collected from potential customers and the company's workers through interviews and questionnaires, and a Pareto chart/analysis and a control chart (p-chart) were constructed. It was found that the company has many problems; specifically, there is high rejection or waste in the production processing line. The highest waste occurs in the melting process line, which causes loss due to trickle, and in the forming process line, which causes loss due to defective product rejection. The vital few problems were identified: blisters, double seam, stone, pressure failure and overweight. The principal aim of the study is to create awareness in the quality team of how to use SPC tools in problem analysis, especially to train the quality team on how to hold an effective brainstorming session and exploit these data in cause-and-effect diagram construction, Pareto analysis and control chart construction. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company has many constraints to implementing all the suggestions for improvement within a short period of time, it recognized that the suggestions will provide significant productivity improvement in the long run.
Statistical process control analysis for patient-specific IMRT and VMAT QA.
Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd
2013-05-01
This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-megavolt beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart, and the first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than in VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than in VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value was 1.60 for IMRT QA and 1.99 for VMAT QA, which implies that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because their Cpml values are higher than 1.0.
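The chart-setup step described above, deriving individual-chart limits from the first 50 points and then scoring capability against a lower specification, can be sketched as follows. The data are synthetic, and the one-sided index shown here, (mean - LSL) / (3*sigma), is a simplified stand-in for the Cpml used in the paper, whose exact definition is not reproduced in the abstract.

```python
# Hedged sketch: individual-chart control limits from the first 50
# % gamma pass values, plus a simplified one-sided capability index.
# The data, the 90% lower specification limit, and the index form are
# assumptions for the illustration.
import random
import statistics

random.seed(1)
# Synthetic VMAT-like % gamma pass series (mean/sd taken from the abstract).
gamma_pass = [random.gauss(96.7, 2.2) for _ in range(50)]

mean = statistics.fmean(gamma_pass)
sigma = statistics.stdev(gamma_pass)
lcl, ucl = mean - 3 * sigma, mean + 3 * sigma   # individual X-chart limits

LSL = 90.0                                      # assumed clinical action limit
cpml = (mean - LSL) / (3 * sigma)               # simplified one-sided index
print(f"LCL={lcl:.1f}%, UCL={ucl:.1f}%, Cpml~{cpml:.2f}")
```

An index above 1.0 indicates the process mean sits comfortably more than three standard deviations above the action limit, matching the paper's interpretation that both QA processes are of good quality.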
Huser, Raphaël
2018-01-09
Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability of the dependence does not prevail in finite samples. This issue is particularly serious when data are asymptotically independent, such that the dependence strength weakens and eventually vanishes as events become more extreme. We here aim to provide flexible sub-asymptotic models for spatially indexed block maxima, which more realistically account for discrepancies between data and asymptotic theory. We develop models pertaining to the wider class of max-infinitely divisible processes, extending the class of max-stable processes while retaining dependence properties that are natural for maxima: max-id models are positively associated, and they yield a self-consistent family of models for block maxima defined over any time unit. We propose two parametric construction principles for max-id models, emphasizing a point process-based generalized spectral representation, that allows for asymptotic independence while keeping the max-stable extremal-$t$ model as a special case. Parameter estimation is efficiently performed by pairwise likelihood, and we illustrate our new modeling framework with an application to Dutch wind gust maxima calculated over different time units.
Energy Technology Data Exchange (ETDEWEB)
Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States)
1993-10-01
The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance related events at nuclear power plants. The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provides a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume III, is a detailed documentation of the development effort and the pilot training program.
International Nuclear Information System (INIS)
Treat, R.L.; Nesbitt, J.F.; Blair, H.T.; Carter, J.G.; Gorton, P.S.; Partain, W.L.; Timmerman, C.L.
1980-04-01
This document contains preconceptual design data on 11 processes for the solidification and isolation of nuclear high-level liquid wastes (HLLW). The processes are: in-can glass melting (ICGM) process, joule-heated glass melting (JHGM) process, glass-ceramic (GC) process, marbles-in-lead (MIL) matrix process, supercalcine pellets-in-metal (SCPIM) matrix process, pyrolytic-carbon coated pellets-in-metal (PCCPIM) matrix process, supercalcine hot-isostatic-pressing (SCHIP) process, SYNROC hot-isostatic-pressing (SYNROC HIP) process, titanate process, concrete process, and cermet process. For the purposes of this study, it was assumed that each of the solidification processes is capable of handling similar amounts of HLLW generated in a production-sized fuel reprocessing plant. It was also assumed that each of the processes would be enclosed in a shielded canyon or cells within a waste facility located at the fuel reprocessing plant. Finally, it was assumed that all of the processes would be subject to the same set of regulations, codes and standards. Each of the solidification processes converts waste into forms that may be acceptable for geological disposal. Each process begins with the receipt of HLLW from the fuel reprocessing plant. In this study, it was assumed that the original composition of the HLLW would be the same for each process. The process ends when the different waste forms are enclosed in canisters or containers that are acceptable for interim storage. Overviews of each of the 11 processes and the bases used for their identification are presented in the first part of this report. Each process, including its equipment and its requirements, is covered in more detail in Appendices A through K. Pertinent information on the current state of the art and the research and development required for the implementation of each process are also noted in the appendices
Kaur, Baljinder; Chakraborty, Debkumar
2013-11-01
An isolate of P. acidilactici capable of producing vanillin from rice bran was obtained from a milk product. Response Surface Methodology (RSM) was employed for statistical media and process optimization for the production of biovanillin. Statistical medium optimization was done in two steps involving a Plackett-Burman Design and a Central Composite Design. The RSM-optimized vanillin production medium consisted of 15% (w/v) rice bran, 0.5% (w/v) peptone, 0.1% (w/v) ammonium nitrate, 0.005% (w/v) ferulic acid, 0.005% (w/v) magnesium sulphate, and 0.1% (v/v) tween-80, pH 5.6, at a temperature of 37 degrees C under shaking at 180 rpm. 1.269 g/L vanillin was obtained within 24 h of incubation in the optimized culture medium. This is the first report of such a high vanillin yield obtained during biotransformation of ferulic acid to vanillin using a Pediococcal isolate.
A concept of volume rendering guided search process to analyze medical data set.
Zhou, Jianlong; Xiao, Chun; Wang, Zhiyan; Takatsuka, Masahiro
2008-03-01
This paper first presents a parallel-coordinates-based parameter control panel (PCP). The PCP is used to control the parameters of focal region-based volume rendering (FRVR) during data analysis. It uses a parallel-coordinates-style interface: different rendering parameters are represented by nodes on each axis, and renditions based on related parameters are connected using polylines to show dependencies between renditions and parameters. Based on the PCP, a concept of a volume-rendering-guided search process is proposed. The search pipeline is divided into four phases. Different parameters of FRVR are recorded and modulated in the PCP during the search phases. The concept shows that volume visualization can play the role of guiding a search process in the rendition space, helping users efficiently find local structures of interest. The usability of the proposed approach is evaluated to show its effectiveness.
GPR Raw-Data Order Statistic Filtering and Split-Spectrum Processing to Detect Moisture
Directory of Open Access Journals (Sweden)
Gokhan Kilic
2014-05-01
Considerable research into the area of bridge health monitoring has been undertaken; however, information is still lacking on the effects of certain defects, such as moisture ingress, on the results of ground penetrating radar (GPR) surveying. In this paper, this issue is addressed by examining the results of a GPR bridge survey, specifically the effect of moisture on the predicted position of the rebars. It was found that moisture ingress alters the radargram to indicate distortion or skewing of the steel reinforcements, when in fact destructive testing was able to confirm that no such distortion or skewing had occurred. Additionally, split-spectrum processing with order statistic filters was utilized to detect moisture ingress from the GPR raw data.
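An order statistic filter replaces each sample by a chosen rank of its sorted neighbourhood; the median (the middle rank) is the classic case for suppressing impulsive clutter in raw radar traces. The sketch below applies a sliding-window order statistic filter to a synthetic 1-D trace; it is a generic illustration of the filter family, not the paper's split-spectrum pipeline, and the trace and window size are assumptions.

```python
# Hedged sketch: 1-D order statistic filtering of a synthetic "A-scan".
# The trace, spike pattern, and window length are invented for illustration.

# Synthetic trace: a smooth ramp plus periodic impulsive clutter spikes.
trace = [10.0 + 0.1 * i for i in range(100)]
for i in range(0, 100, 17):
    trace[i] += 50.0

def order_statistic_filter(signal, window=5, rank=None):
    """Replace each sample by an order statistic of its sliding window
    (the median when rank is None)."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        w = sorted(signal[max(0, i - half):i + half + 1])
        k = len(w) // 2 if rank is None else min(rank, len(w) - 1)
        out.append(w[k])
    return out

filtered = order_statistic_filter(trace)
print("max before:", round(max(trace), 1), "max after:", round(max(filtered), 1))
```

Choosing a rank other than the median turns the same routine into a min or max filter, which is one reason order statistic filters are a flexible pre-processing stage before split-spectrum analysis.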
Damage localization by statistical evaluation of signal-processed mode shapes
DEFF Research Database (Denmark)
Ulriksen, Martin Dalgaard; Damkilde, Lars
2015-01-01
Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in th...... is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.......) and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes...
Energy Technology Data Exchange (ETDEWEB)
Portwood, J.T.
1995-12-31
This paper discusses a database of information collected and organized during the past eight years from 2,000 producing oil wells in the United States, all of which have been treated with special applications techniques developed to improve the effectiveness of MEOR technology. The database, believed to be the first of its kind, has been generated for the purpose of statistically evaluating the effectiveness and economics of the MEOR process in a wide variety of oil reservoir environments, and is a tool that can be used to improve the predictability of treatment response. The information in the database has also been evaluated to determine which, if any, reservoir characteristics are dominant factors in determining the applicability of MEOR.
Self-Organized Criticality in Astrophysics The Statistics of Nonlinear Processes in the Universe
Aschwanden, Markus
2011-01-01
The concept of ‘self-organized criticality’ (SOC) has been applied to a variety of problems, ranging from population growth and traffic jams to earthquakes, landslides and forest fires. The technique is now being applied to a wide range of phenomena in astrophysics, such as planetary magnetospheres, solar flares, cataclysmic variable stars, accretion disks, black holes and gamma-ray bursts, and also to phenomena in galactic physics and cosmology. Self-organized Criticality in Astrophysics introduces the concept of SOC and shows that, due to its universality and ubiquity, it is a law of nature. The theoretical framework and specific physical models are described, together with a range of applications in various aspects of astrophysics. The mathematical techniques, including the statistics of random processes, time series analysis, time scale and waiting time distributions, are presented and the results are applied to specific observations of astrophysical phenomena.
Statistical learning problem of artificial neural network to control roofing process
Directory of Open Access Journals (Sweden)
Lapidus Azariy
2017-01-01
Full Text Available Software developed on the basis of artificial neural networks (ANN) is now being actively implemented in construction companies to support decision-making in the organization and management of construction processes. ANN learning is the main stage of its development. A key question for supervised learning is how many training examples are needed to approximate the true relationship between network inputs and output with the desired accuracy. The design of the ANN architecture is also tied to a learning problem known as the “curse of dimensionality”. This problem is important for the study of construction process management because training data are difficult to obtain from construction sites. In previous studies the authors designed a 4-layer feedforward ANN with a 12-5-4-1 unit architecture to estimate and predict the roofing process. This paper presents the statistical learning side of the created ANN with a simple error-minimization algorithm. The sample size needed for efficient training and the confidence interval of the network outputs are defined. In conclusion, the authors predict that the ANN can be trained successfully in a large construction business company within a short space of time.
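The training stage described above can be sketched with a toy version of such a network. The 12-5-4-1 architecture matches the abstract, but the synthetic data, sigmoid activations, learning rate and target function are illustrative assumptions, not the authors' roofing-process model:

```python
import numpy as np

# Toy 4-layer 12-5-4-1 feedforward network trained by simple
# mean-squared-error minimization (plain gradient descent).
rng = np.random.default_rng(0)
sizes = [12, 5, 4, 1]
W = [rng.normal(0, 0.5, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(n) for n in sizes[1:]]

def forward(X):
    acts = [X]
    for i, (w, bi) in enumerate(zip(W, b)):
        z = acts[-1] @ w + bi
        # sigmoid on hidden layers, linear output layer
        acts.append(1 / (1 + np.exp(-z)) if i < len(W) - 1 else z)
    return acts

X = rng.normal(size=(200, 12))
y = X[:, :3].mean(axis=1, keepdims=True)        # toy target to learn

mse = lambda: float(np.mean((forward(X)[-1] - y) ** 2))
mse_before = mse()
for epoch in range(1000):
    acts = forward(X)
    grad = (acts[-1] - y) / len(X)              # gradient at the output
    for i in reversed(range(len(W))):
        gW, gb = acts[i].T @ grad, grad.sum(axis=0)
        if i > 0:                               # backprop through sigmoid
            grad = (grad @ W[i].T) * acts[i] * (1 - acts[i])
        W[i] -= 0.1 * gW
        b[i] -= 0.1 * gb
mse_after = mse()
print(round(mse_before, 4), round(mse_after, 4))
```

The error decreases with training; in practice the required sample size would be judged against the confidence interval of the outputs, as the abstract describes.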
Directory of Open Access Journals (Sweden)
Sutikno Sutikno
2010-08-01
Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model built from deterministic numerical equations that follow the laws of physics. It is the main tool for predicting climate and weather, and a primary information source for reviewing the effects of climate change. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM with the small scale of a study area. GCM output is spatial and temporal data in which spatial correlation between grid points within a single domain is very likely, and this multicollinearity requires pre-processing of the predictor variables X. Continuum Regression (CR), together with pre-processing by Principal Component Analysis (PCA), is an alternative for SD modelling. CR, developed by Stone and Brooks (1990), is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, used to overcome multicollinearity problems. Data processing for the stations at Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that, for the 8x8 and 12x12 domains, the CR method produces better RMSEP and predictive R2 values than PCR and PLS.
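A minimal sketch of the principal-component pre-processing idea is principal component regression: regress the response on the leading components of collinear predictors. The synthetic data below merely mimics correlated GCM grid-point predictors; it is not the study's data, and full Continuum Regression is not implemented:

```python
import numpy as np

# Principal component regression (PCR) sketch for collinear predictors.
rng = np.random.default_rng(1)
n, p, k = 300, 10, 3                      # samples, predictors, components kept
base = rng.normal(size=(n, 3))            # 3 latent drivers
X = base @ rng.normal(size=(3, p)) + 0.1 * rng.normal(size=(n, p))  # collinear X
y = base[:, 0] - 2 * base[:, 1] + 0.1 * rng.normal(size=n)

Xc = X - X.mean(axis=0)
yc = y - y.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA via SVD
T = Xc @ Vt[:k].T                         # scores on the first k components
gamma, *_ = np.linalg.lstsq(T, yc, rcond=None)
beta = Vt[:k].T @ gamma                   # back-transform to predictor space

pred = Xc @ beta + y.mean()
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(float(r2), 3))
```

Regressing on a few components rather than the raw grid values is exactly what removes the multicollinearity the abstract describes; CR additionally tunes a continuum parameter between the OLS, PCR and PLS extremes.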
Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis
Directory of Open Access Journals (Sweden)
Alireza Raygan Shirazinezhad
2015-06-01
Full Text Available Background: Wastewater treatment involves very complex and interrelated physical, chemical and biological processes which, using data analysis techniques, can be rigorously modeled with relatively simple mathematical models. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer-Ahmad were used. A total of 3306 records of COD, TSS, pH and turbidity were collected, then analyzed with SPSS 16 (descriptive statistics) and IBM SPSS Modeler 14.2, using nine algorithms. Results: The logistic regression, neural network, Bayesian network, discriminant analysis, C5 decision tree, C&R tree, CHAID, QUEST and SVM algorithms had accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92 percent, respectively. Discussion and conclusion: The C5 algorithm, with an accuracy of 97.89 percent, was chosen as the best and most applicable algorithm for modeling wastewater treatment processes, and the most influential variables in this model were pH, COD, TSS and turbidity.
Poster - Thur Eve - 29: Detecting changes in IMRT QA using statistical process control.
Drever, L; Salomons, G
2012-07-01
Statistical process control (SPC) methods were used to analyze 239 measurement-based individual IMRT QA events. The selected IMRT QA events were all head and neck (H&N) cases with 70 Gy in 35 fractions, and all prostate cases with 76 Gy in 38 fractions, planned between March 2009 and 2012. The results were used to determine whether the tolerance limits currently being used for IMRT QA were able to indicate if the process was under control. The SPC calculations were repeated for IMRT QA of the same types of cases planned after the treatment planning system was upgraded from Eclipse version 8.1.18 to version 10.0.39. The initial tolerance limits were found to be acceptable for two of the three metrics tested prior to the upgrade. After the upgrade to the treatment planning system, the SPC analysis found that the a priori limits were no longer capable of indicating control for two of the three metrics analyzed. The changes in the IMRT QA results were clearly identified using SPC, indicating that it is a useful tool for finding changes in the IMRT QA process. Routine application of SPC to IMRT QA results would help to distinguish unintentional trends and changes from the random variation in the IMRT QA results for individual plans. © 2012 American Association of Physicists in Medicine.
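The tolerance-limit calculation behind such an analysis can be sketched with an individuals (X) control chart, where the process spread is estimated from the average moving range. The gamma pass-rate values below are invented for illustration:

```python
import numpy as np

# Individuals (X) control chart limits from a series of QA measurements,
# using the average moving range (bias-correction constant d2 = 1.128 for n = 2).
x = np.array([97.2, 98.1, 96.5, 97.8, 98.4, 96.9, 97.5, 98.0,
              97.1, 96.8, 98.2, 97.6, 97.9, 96.7, 97.3])
mr = np.abs(np.diff(x))                 # moving range of successive points
sigma_hat = mr.mean() / 1.128           # estimated process standard deviation
center = x.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
out_of_control = x[(x > ucl) | (x < lcl)]
print(round(center, 2), round(lcl, 2), round(ucl, 2), len(out_of_control))
```

Recomputing these limits on post-upgrade data and checking how many points fall outside them is, in essence, the comparison the abstract performs.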
Energy Technology Data Exchange (ETDEWEB)
Pourmortazavi, Seied Mahdi, E-mail: pourmortazavi@yahoo.com [Faculty of Material and Manufacturing Technologies, Malek Ashtar University of Technology, P.O. Box 16765-3454, Tehran (Iran, Islamic Republic of); Babaee, Saeed; Ashtiani, Fatemeh Shamsi [Faculty of Chemistry & Chemical Engineering, Malek Ashtar University of Technology, Tehran (Iran, Islamic Republic of)
2015-09-15
Graphical abstract: - Highlights: • Surface of magnesium particles was modified with Viton via a solvent/non-solvent method. • FT-IR, SEM, EDX, Map analysis, and TG/DSC techniques were employed to characterize the coated particles. • Coating process factors were optimized by Taguchi robust design. • The effect of coating conditions on the resistance of coated magnesium against oxidation was studied. - Abstract: The surface of magnesium particles was modified by coating with Viton as an energetic polymer using the solvent/non-solvent technique. The Taguchi robust method was utilized as a statistical experiment design to evaluate the role of the coating process parameters. The coated magnesium particles were characterized by various techniques, i.e., Fourier transform infrared (FT-IR) spectroscopy, scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), thermogravimetry (TG), and differential scanning calorimetry (DSC). The results showed that coating magnesium powder with Viton leads to a higher resistance of the metal against oxidation in air. Meanwhile, tuning the coating process parameters (i.e., percentage of Viton, flow rate of non-solvent addition, and type of solvent) influences the resistance of the metal particles against thermal oxidation. Coating yields Viton-coated magnesium particles with higher thermal stability (oxidation onset at 632 °C) than the pure magnesium powder, which commences oxidation in air at the lower temperature of 260 °C.
Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.
2008-01-01
Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright ?? 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
The Relationship between Processing Speed and Regional White Matter Volume in Healthy Young People.
Directory of Open Access Journals (Sweden)
Daniele Magistro
Full Text Available Processing speed is considered a key cognitive resource and plays a crucial role in all types of cognitive performance. Some researchers have hypothesised the importance of white matter integrity in the brain for processing speed; however, the whole-brain relationship between white matter volume (WMV) and processing speed relevant to the modality or problem used in the task has never been clearly evaluated in healthy people. In this study, we used various tests of processing speed and Voxel-Based Morphometry (VBM) analyses, which involve voxel-wise comparison of local gray and white matter volume, to assess the relationship between processing speed and regional WMV (rWMV). We examined this association in 887 healthy young adults (504 men and 383 women; mean age, 20.7 years; SD, 1.85). We performed three different multiple regression analyses, evaluating the rWMV associated with individual differences in a simple processing speed task, the word-colour and colour-word tasks (processing speed tasks with words) and a simple arithmetic task, after adjusting for age and sex. The results showed a positive relationship at the whole-brain level between rWMV and processing speed performance. In contrast, processing speed performance did not correlate with rWMV in any specific region examined. Our results support the idea that WMV is associated globally with processing speed performance regardless of the type of processing speed task.
Wind gust estimation by combining numerical weather prediction model and statistical post-processing
Patlakas, Platon; Drakaki, Eleni; Galanis, George; Spyrou, Christos; Kallos, George
2017-04-01
The continuous rise of off-shore and near-shore activities, as well as the development of structures such as wind farms and various offshore platforms, requires the employment of state-of-the-art risk assessment techniques. Such analysis is used to set safety standards and can be characterized as a climatologically oriented approach. Nevertheless, reliable operational support is also needed in order to minimize cost drawbacks and human danger during the construction and functioning stages as well as during maintenance activities. One of the most important parameters for this kind of analysis is wind speed intensity and variability. A critical measure associated with this variability is the presence and magnitude of wind gusts as estimated at the reference level of 10 m. The latter can be attributed to different processes ranging from boundary-layer turbulence and convective activity to mountain waves and wake phenomena. The purpose of this work is the development of a wind gust forecasting methodology combining a Numerical Weather Prediction model and a dynamical statistical tool based on Kalman filtering. To this end, the parameterization of the Wind Gust Estimate method was implemented to function within the framework of the atmospheric model SKIRON/Dust. The new modeling tool combines the atmospheric model with a statistical local adaptation methodology based on Kalman filters, and has been tested over the offshore west coastline of the United States. The main purpose is to provide a useful tool for wind analysis and prediction and for applications related to offshore wind energy (power prediction, operation and maintenance). The results have been evaluated using observational data from NOAA's buoy network. The predicted output shows good behavior that is further improved by the local-adjustment post-processing.
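The Kalman-filter post-processing idea can be sketched with a scalar filter that tracks the slowly varying bias between forecasts and observations. The noise variances and the synthetic forecast/observation series are illustrative assumptions, not the SKIRON/Dust configuration:

```python
import numpy as np

# Scalar Kalman filter tracking the systematic bias between model
# wind-gust forecasts and observations, then correcting each forecast.
rng = np.random.default_rng(2)
true_bias = 2.0                                 # model overpredicts by ~2 m/s
obs = 10 + rng.normal(0, 1.5, 200)              # synthetic "observed" gusts
fcst = obs + true_bias + rng.normal(0, 1.0, 200)

q, r = 0.01, 1.0                                # process / measurement noise (assumed)
bias, P = 0.0, 1.0                              # initial bias estimate and variance
corrected = []
for f, o in zip(fcst, obs):
    corrected.append(f - bias)                  # correct with the current estimate
    P += q                                      # predict: bias drifts slowly
    K = P / (P + r)                             # Kalman gain
    bias += K * ((f - o) - bias)                # update with the new innovation
    P *= (1 - K)

raw_mae = float(np.mean(np.abs(fcst - obs)))
cor_mae = float(np.mean(np.abs(np.array(corrected) - obs)))
print(round(raw_mae, 2), round(cor_mae, 2))
```

The corrected series has a lower mean absolute error than the raw forecasts, which is the kind of improvement the local adjustment provides operationally.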
NJOY nuclear data processing system. Volume IV. The ERRORR and COVR modules
International Nuclear Information System (INIS)
Muir, D.W.; MacFarlane, R.E.
1985-12-01
The NJOY nuclear data processing system is a comprehensive computer code package for producing cross sections and related nuclear parameters from ENDF/B evaluated nuclear data. This volume provides detailed descriptions of the NJOY modules ERRORR and COVR, which are concerned with the covariances (uncertainties and correlations) of multigroup cross sections and fission neutron yield (anti nu) values. 17 refs
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.
Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…
Initiating statistical process control to improve quality outcomes in colorectal surgery.
Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P
2015-12-01
Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility for improving outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed: 83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in the elective open (57.1 vs. 20.0%) and elective laparoscopic (77.6 vs. 26.1%) groups. Outliers also had higher readmission rates in the emergent open (11.4 vs. 5.4%), emergent laparoscopic (14.3 vs. 9.2%), and elective laparoscopic (32.8 vs. 6.9%) groups. Elective open outliers did not follow trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification during quality improvement efforts and reevaluation of outcomes after introducing process changes. SPC has clinical implications for improving patient outcomes and resource utilization.
International Nuclear Information System (INIS)
Hu, T.A.; Lo, J.C.
1994-11-01
A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling of off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the roles of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement.
The physics benchmark processes for the detector performance studies used in CLIC CDR Volume 3
Allanach, B.J.; Desch, K.; Ellis, J.; Giudice, G.; Grefe, C.; Kraml, S.; Lastovicka, T.; Linssen, L.; Marschall, J.; Martin, S.P.; Muennich, A.; Poss, S.; Roloff, P.; Simon, F.; Strube, J.; Thomson, M.; Wells, J.D.
2012-01-01
This note describes the detector benchmark processes used in volume 3 of the CLIC conceptual design report (CDR), which explores a staged construction and operation of the CLIC accelerator. The goal of the detector benchmark studies is to assess the performance of the CLIC ILD and CLIC SiD detector concepts for different physics processes and at a few CLIC centre-of-mass energies.
Infant Statistical-Learning Ability Is Related to Real-Time Language Processing
Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf
2018-01-01
Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…
Directory of Open Access Journals (Sweden)
Sean Ekins
Full Text Available Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.
Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun
2017-11-01
This study applied statistical process control to set and verify a quality assurance (QA) tolerance standard suited to our hospital's characteristics, with criteria standards applied to all treatment sites in the analysis. The gamma test factor for delivery quality assurance (DQA) was based on 3%/3 mm. Head and neck, breast and prostate cases treated with intensity-modulated radiation therapy (IMRT) or volumetric-modulated arc therapy (VMAT) were selected for the analysis of the QA treatment sites. The numbers of data used in the analysis were 73 and 68 for the head and neck patients, and 49 and 152 for prostate and breast, measured by MapCHECK and ArcCHECK, respectively. The Cp values of the head and neck and prostate QA were above 1.0, and Cpml was 1.53 and 1.71, respectively, close to the target value of 100%. The Cpml value of breast IMRT was 1.67, with data values close to the target value of 95%, but the Cp value was 0.90, which means that the data values were widely distributed. Cp and Cpml of the breast VMAT QA were 1.07 and 2.10, respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.
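The capability indices used above can be computed directly from QA data. The sketch below uses the standard two-sided Cp and the target-penalized Cpm (the abstract's Cpml is a one-sided variant of this family); the specification limits, target and data are invented for illustration:

```python
import numpy as np

# Process capability indices for gamma pass-rate QA data.
# Cp compares specification width to process spread (6 sigma);
# Cpm additionally penalizes deviation of the mean from a target T.
x = np.array([96.4, 97.1, 98.0, 97.5, 96.9, 97.8, 98.3, 97.2,
              96.6, 97.9, 98.1, 97.4, 97.0, 97.7, 98.2])
lsl, usl, target = 95.0, 100.0, 97.5     # assumed spec limits and target (%)
mu, sigma = x.mean(), x.std(ddof=1)

cp = (usl - lsl) / (6 * sigma)
cpm = (usl - lsl) / (6 * np.sqrt(sigma**2 + (mu - target)**2))
print(round(float(cp), 2), round(float(cpm), 2))
```

A Cp above 1.0 means the spread fits within the specification band; Cpm can only equal Cp when the process mean sits exactly on target, which mirrors the abstract's reading of wide distributions versus closeness to target.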
DEFF Research Database (Denmark)
Thorborg, Jesper
, however, is constituted by the implementation of the $J_2$ flow theory in the control volume method. To apply the control volume formulation to the process of hardening concrete, viscoelastic stress-strain models have been examined in terms of various rheological models. The generalized 3D models are based...... on two different suggestions in the literature, that is, compressible or incompressible behaviour of the viscous response in the dashpot element. Numerical implementation of the models has shown very good agreement with corresponding analytical solutions. The viscoelastic solid mechanical model is used......
Directory of Open Access Journals (Sweden)
Shafiul Haque
2016-11-01
Full Text Available For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release is a major bottleneck. The recovery of a recombinant protein, cholesterol oxidase (COD), was studied in a continuous bead milling process. A full factorial Response Surface Model (RSM) design was employed and compared to Artificial Neural Networks coupled with a Genetic Algorithm (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C) and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading of OD600 nm 74, and run time of 29.9 min, with a recovery of ~3.2 g/L. ANN coupled with GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate of 258.08 mL/h, bead loading of 80% (v/v), cell loading of OD600 nm 73.99, and run time of 32 min. An overall 3.7-fold increase in productivity is obtained when compared to a batch process. Optimization and comparison of statistical vs. artificial intelligence techniques in a continuous bead milling process has been attempted for the very first time in our study. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on bead milling parameters. Quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence, whereas an ANN, being a summation of functions over multiple layers, is capable of representing it; here, enzyme recovery as a function of bead milling parameters. Since a GA can even optimize discontinuous functions, the present study is an example of using machine learning (ANN) in combination with evolutionary optimization (GA) to represent otherwise undefined biological functions, as is the case for common industrial processes involving biological moieties.
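The ANN-GA idea can be sketched with a simple genetic algorithm searching the four process variables for maximum recovery. The surrogate objective below is a made-up quadratic whose peak sits near the reported optimum; it is not the authors' trained network, and the bounds are assumptions:

```python
import numpy as np

# Genetic algorithm maximizing a surrogate recovery function over the
# four process variables (feed rate A, bead load B, cell load C, run time D).
rng = np.random.default_rng(3)
lo = np.array([100.0, 50.0, 40.0, 10.0])       # assumed lower bounds
hi = np.array([400.0, 90.0, 90.0, 40.0])       # assumed upper bounds
opt = np.array([258.0, 80.0, 74.0, 32.0])      # peak placed near reported optimum

def recovery(x):                               # made-up surrogate, peak ~3.5 g/L
    z = (x - opt) / (hi - lo)
    return 3.5 - 8.0 * np.sum(z**2, axis=-1)

pop = lo + rng.uniform(size=(60, 4)) * (hi - lo)
for gen in range(80):
    fit = recovery(pop)
    parents = pop[np.argsort(fit)[-30:]]       # truncation selection: keep best half
    kids = []
    for _ in range(60):
        a, b = parents[rng.integers(30, size=2)]
        child = np.where(rng.random(4) < 0.5, a, b)     # uniform crossover
        child += rng.normal(0, 0.02, 4) * (hi - lo)     # small Gaussian mutation
        kids.append(np.clip(child, lo, hi))
    pop = np.array(kids)

best = pop[np.argmax(recovery(pop))]
print(np.round(best, 1), round(float(recovery(best)), 2))
```

Because the GA only evaluates the objective pointwise, the same loop works unchanged when the surrogate is a trained ANN rather than this smooth quadratic, which is exactly why GA pairs well with ANN response models.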
Statistical evaluation of tablet coating processes: influence of pan design and solvent type
Directory of Open Access Journals (Sweden)
Valdomero Pereira de Melo Junior
2010-12-01
Full Text Available Partially and fully perforated pan coaters are among the most relevant types of equipment currently used in the process of coating tablets. The goal of this study was to assess the performance differences among these types of equipment employing a factorial design. This statistical approach allowed the simultaneous study of the process variables and verification of interactions among them. The study included partially perforated and fully perforated pan coaters, aqueous and organic solvents, as well as a hypromellose-based immediate-release coating. The dependent variables were process time, energy consumption, mean tablet weight and process yield. For the tests, placebo tablets with a mean weight of 250 mg were produced, divided into eight lots of two kilograms each and coated in duplicate, using both partially perforated and fully perforated pan coaters. The results showed a significant difference between the types of equipment used (partially and fully perforated pan coaters) with regard to process time and energy consumption, whereas no significant difference was identified for mean weight of the coated tablets or process yield.
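Effect estimation in such a factorial design can be sketched as follows; a 2^3 design is assumed, with a hypothetical third factor and invented process-time responses, since the paper's raw data are not reproduced here:

```python
import numpy as np
from itertools import product

# Main effects and two-factor interactions for a 2^3 full factorial
# coating experiment in coded units (-1/+1 levels). The response values
# (process time, minutes) are invented for illustration.
levels = np.array(list(product([-1, 1], repeat=3)))      # 8 runs
y = np.array([62., 55., 60., 54., 48., 41., 47., 40.])   # one response per run

names = ["pan", "solvent", "spray"]
effects = {}
for i, name in enumerate(names):                          # main effects
    effects[name] = float(np.mean(y[levels[:, i] == 1]) -
                          np.mean(y[levels[:, i] == -1]))
for i, j in [(0, 1), (0, 2), (1, 2)]:                     # interactions
    c = levels[:, i] * levels[:, j]                       # product of coded columns
    effects[f"{names[i]}x{names[j]}"] = float(np.mean(y[c == 1]) -
                                              np.mean(y[c == -1]))
print({k: round(v, 2) for k, v in effects.items()})
```

With these invented numbers the pan-type effect dominates while the solvent effect and all interactions are small, mirroring the pattern the study reports for process time.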
Exploring the use of statistical process control methods to assess course changes
Vollstedt, Ann-Marie
This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g. inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, quality was defined in this study as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshman classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, individual course grades needed adjustment to accommodate the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control
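Chauvenet's criterion, used above for outlier removal, rejects a point when the expected number of samples that extreme, N·P(|Z| ≥ z), falls below 0.5. A minimal sketch with invented grade data:

```python
import math

# Chauvenet's criterion: reject a point when the expected count of
# samples at least that far from the mean, N * P(|Z| >= z), is < 0.5.
def chauvenet(data):
    n = len(data)
    mean = sum(data) / n
    std = (sum((v - mean) ** 2 for v in data) / (n - 1)) ** 0.5
    kept, rejected = [], []
    for v in data:
        z = abs(v - mean) / std
        prob = math.erfc(z / math.sqrt(2))      # two-sided normal tail probability
        (rejected if n * prob < 0.5 else kept).append(v)
    return kept, rejected

grades = [78, 82, 85, 80, 77, 83, 81, 79, 84, 45]   # 45 is a suspect outlier
kept, rejected = chauvenet(grades)
print(rejected)
```

A single pass is shown; in practice the criterion is sometimes applied iteratively, recomputing the mean and standard deviation after each rejection.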
Walter, Donald A.; Starn, J. Jeffrey
2013-01-01
Statistical models of nitrate occurrence in the glacial aquifer system of the northern United States, developed by the U.S. Geological Survey, use observed relations between nitrate concentrations and sets of explanatory variables—representing well-construction, environmental, and source characteristics—to predict the probability that nitrate, as nitrogen, will exceed a threshold concentration. However, the models do not explicitly account for the processes that control the transport of nitrogen from surface sources to a pumped well and use area-weighted mean spatial variables computed from within a circular buffer around the well as a simplified source-area conceptualization. The use of models that explicitly represent physical-transport processes can inform and, potentially, improve these statistical models. Specifically, groundwater-flow models simulate advective transport—predominant in many surficial aquifers—and can contribute to the refinement of the statistical models by (1) providing for improved, physically based representations of a source area to a well, and (2) allowing for more detailed estimates of environmental variables. A source area to a well, known as a contributing recharge area, represents the area at the water table that contributes recharge to a pumped well; a well pumped at a volumetric rate equal to the amount of recharge through a circular buffer will result in a contributing recharge area that is the same size as the buffer but has a shape that is a function of the hydrologic setting. These volume-equivalent contributing recharge areas will approximate circular buffers in areas of relatively flat hydraulic gradients, such as near groundwater divides, but in areas with steep hydraulic gradients will be elongated in the upgradient direction and agree less with the corresponding circular buffers. The degree to which process-model-estimated contributing recharge areas, which simulate advective transport and therefore account for
Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C
2010-01-01
Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, the root cause was identified and eliminated to prevent further set-up errors. Set-up errors were measured for the medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions, and the upper control limits were then calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using sub-groups of 3 patients, three times each shift. These values were plotted on a control chart in real time. The control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the stability of the set-up process; if and when stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified, and corrective action was taken before proceeding with the treatment. The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared to weekly sampling of the set-up error in each and every patient, which may only ensure that the sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces control costs. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.
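The sub-grouped control-limit construction described in this abstract can be sketched in a few lines of Python (all set-up error values below are hypothetical; the A2 factor is the standard X-bar chart constant for sub-groups of size 3):

```python
from statistics import mean

# Hypothetical set-up error measurements (mm) in one dimension:
# sub-groups of 3 patients, as in the protocol described above.
subgroups = [
    [1.2, 0.8, 1.5],
    [0.9, 1.1, 1.3],
    [1.4, 0.7, 1.0],
    [1.1, 1.2, 0.6],
]

A2 = 1.023  # X-bar chart constant for sub-groups of size 3

xbar = [mean(g) for g in subgroups]            # sub-group means
ranges = [max(g) - min(g) for g in subgroups]  # sub-group ranges

grand_mean = mean(xbar)
mean_range = mean(ranges)

ucl = grand_mean + A2 * mean_range  # upper control limit
lcl = grand_mean - A2 * mean_range  # lower control limit

# sub-groups whose mean falls outside the limits would trigger
# the interruption and root-cause search described above
out_of_control = [i for i, x in enumerate(xbar) if not (lcl <= x <= ucl)]
print(ucl, lcl, out_of_control)
```

With real data, points outside [LCL, UCL], or non-random patterns within the limits, would be the signal to stop treatment and investigate the assignable cause.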
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process-performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the gap-analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
DEFF Research Database (Denmark)
Carstensen, Jakob; Madsen, Henrik; Poulsen, Niels Kjølstad
1994-01-01
of the processes, i.e. including prior knowledge, with the significant effects found in data by using statistical identification methods. Rates of the biochemical and hydraulic processes are identified by statistical methods and the related constants for the biochemical processes are estimated assuming Monod...... kinetics. The models only include those hydraulic and kinetic parameters which have been shown to be significant in a statistical sense, and hence they can be quantified. The application potential of these models is on-line control, because the present state of the plant is given by the variables of the models......The introduction of on-line sensors of nutrient salt concentrations on wastewater treatment plants opens a wide new area of modelling wastewater processes. Time series models of these processes are very useful for gaining insight into the real-time operation of wastewater treatment systems which deal...
International Nuclear Information System (INIS)
Ross, W.A.; Lokken, R.O.; May, R.P.; Roberts, F.P.; Timmerman, C.L.; Treat, R.L.; Westsik, J.H. Jr.
1982-09-01
This study assesses seven waste forms and eight processes for immobilizing transuranic (TRU) wastes. The waste forms considered are cast cement, cold-pressed cement, FUETAP (formed under elevated temperature and pressure) cement, borosilicate glass, aluminosilicate glass, basalt glass-ceramic, and cold-pressed and sintered silicate ceramic. The waste-immobilization processes considered are in-can glass melting, joule-heated glass melting, glass marble forming, cement casting, cement cold-pressing, FUETAP cement processing, ceramic cold-pressing and sintering, and basalt glass-ceramic processing. Properties considered included gas generation, chemical durability, mechanical strength, thermal stability, and radiation stability. The ceramic products demonstrated the best properties, except for plutonium release during leaching. The glass and ceramic products had similar properties. The cement products generally had poorer properties than the other forms, except for plutonium release during leaching. Calculations of the Pu release indicated that the waste forms met the proposed NRC release-rate limit of 1 part in 10⁵ per year under most test conditions. The cast-cement process had the lowest processing cost, followed closely by the cold-pressed and FUETAP cement processes. Joule-heated glass melting had the lowest cost of the glass processes. In-can melting in a high-quality canister had the highest cost, and cold-pressed and sintered ceramic the second highest. Labor and canister costs for in-can melting were identified. The major contributor to the cost of disposing of TRU wastes in a defense waste repository is waste processing; repository costs could become the dominant cost for disposing of TRU wastes in a commercial repository. It is recommended that the cast cement, FUETAP cement and borosilicate glass waste-form systems be considered. 13 figures, 16 tables
International Nuclear Information System (INIS)
Bradshaw, R.C.; Schmidt, D.P.; Rogers, J.R.; Kelton, K.F.; Hyers, R.W.
2005-01-01
By combining the best practices in optical dilatometry with numerical methods, a high-speed and high-precision technique has been developed to measure the volume of levitated, containerlessly processed samples with subpixel resolution. Containerless processing provides the ability to study highly reactive materials without the possibility of contamination affecting thermophysical properties. Levitation is a common technique used to isolate a sample as it is being processed. Noncontact optical measurement of thermophysical properties is very important as traditional measuring methods cannot be used. Modern, digitally recorded images require advanced numerical routines to recover the subpixel locations of sample edges and, in turn, produce high-precision measurements
Chitre, S. R.
1978-01-01
The paper presents an experimentally developed surface macro-structuring process suitable for high-volume production of silicon solar cells. The process lends itself easily to automation for high throughput to meet low-cost solar array goals. The tetrahedron structures observed are 0.5-12 microns high. The surface has minimal pitting, with virtually no or very few undeveloped areas across the surface. The process was developed for (100)-oriented, as-cut silicon; chemi-etched, hydrophobic and lapped surfaces were also successfully texturized. A cost analysis per SAMICS is presented.
Extrusion Process by Finite Volume Method Using OpenFoam Software
International Nuclear Information System (INIS)
Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz
2011-01-01
Computational codes are very important tools for solving engineering problems, and the analysis of metal forming processes such as extrusion is no different, because computational codes allow the process to be analyzed at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents the velocity field and friction coefficient variation results obtained by numerical simulation of an aluminum direct cold extrusion process, using the OpenFOAM software and the FVM.
International Nuclear Information System (INIS)
Lan, Ganhui; Tu, Yuhai
2016-01-01
preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions and guide further development of the model. Due to the recent growth of biological knowledge, thanks in part to high-throughput methods (sequencing, gene expression microarrays, etc.) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be met in different biological systems. The close interaction now possible between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and the Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory.
We will also
Influences of excluded volume of molecules on signaling processes on the biomembrane.
Directory of Open Access Journals (Sweden)
Masashi Fujii
We investigate the influences of the excluded volume of molecules on biochemical reaction processes on 2-dimensional surfaces, using a model of signal transduction processes on biomembranes. We perform simulations of the 2-dimensional cell-based model, which describes the reactions and diffusion of the receptors, signaling proteins, target proteins, and crowders on the cell membrane. The signaling proteins are activated by receptors, and these activated signaling proteins activate target proteins that bind autonomously from the cytoplasm to the membrane and unbind from the membrane when activated. If the target proteins bind frequently, the volume fraction of molecules on the membrane becomes so large that the excluded volume of the molecules cannot be neglected in the reaction and diffusion dynamics. We find that such excluded-volume effects induce non-trivial variations of the signal flow, defined as the activation frequency of target proteins, as follows. With an increase in the binding rate of target proteins, the signal flow varies by (i) monotonically increasing; (ii) increasing then decreasing in a bell-shaped curve; or (iii) increasing, decreasing, then increasing in an S-shaped curve. We further demonstrate that the excluded volume of molecules influences the hierarchical molecular distributions throughout the reaction processes. In particular, when the system exhibits a large signal flow, the signaling proteins tend to surround the receptors to form receptor-signaling protein clusters, and the target proteins tend to become distributed around such clusters. To explain these phenomena, we analyze a stochastic model of the local motions of molecules around the receptor.
International Nuclear Information System (INIS)
Ma, Lingling; Lv, Enmin; Du, Lixiong; Lu, Jie; Ding, Jincheng
2016-01-01
Highlights: • Microwave irradiation was employed for the esterification of acidified oil. • Optimization and modeling of the process were performed by RSM and ANN. • Both models have reliable prediction abilities, but the ANN was superior to the RSM. • Membrane vapor permeation and in-situ dehydration were used to shift the equilibrium. • The two dehydration approaches improved the FFA conversion rate by approximately 20.0%. - Abstract: The esterification of acidified oil with ethanol under microwave irradiation was modeled and optimized using response surface methodology (RSM) and an artificial neural network (ANN). The impacts of the mass ratio of ethanol to acidified oil, catalyst loading, microwave power and reaction time were evaluated by a Box-Behnken design (BBD) for the RSM and a multi-layer perceptron (MLP) for the ANN. RSM combined with the BBD gave the optimal conditions as a catalyst loading of 5.85 g, a mass ratio of ethanol to acidified oil of 0.35 (20.0 g acidified oil), a microwave power of 328 W and a reaction time of 98.0 min, with a free fatty acid (FFA) conversion of 78.57%. Both models fit the experimental data well; however, the ANN exhibited better prediction accuracy than the RSM based on the statistical analyses. Furthermore, membrane vapor permeation and in-situ molecular-sieve dehydration were investigated to enhance the esterification under the optimized conditions.
Directory of Open Access Journals (Sweden)
Gilles Feron
For human beings, the mouth is the first organ to perceive food and the different signalling events associated with food breakdown. These events are very complex and, as such, their description necessitates combining different data sets. This study proposes an integrated approach to understand the relative contribution of the main food oral processing events involved in aroma release during cheese consumption. In vivo aroma release was monitored in forty-eight subjects who were asked to eat four different model cheeses varying in fat content and firmness, flavoured with ethyl propanoate and nonan-2-one. A multiblock partial least squares regression was performed to explain aroma release from the different physiological data sets (masticatory behaviour, bolus rheology, saliva composition and flux, mouth coating and bolus moistening). This statistical approach was relevant in pointing out that aroma release was mostly explained by masticatory behaviour whatever the cheese and the aroma, with a specific influence of mean amplitude on aroma release after swallowing. Aroma release from the firmer cheeses was explained mainly by bolus rheology. The persistence of hydrophobic compounds in the breath was mainly explained by bolus spreadability, in close relation with bolus moistening. Resting saliva contributed poorly to the analysis, whereas the composition of stimulated saliva was negatively correlated with aroma release, mostly for soft cheeses, when significant.
Strauch, R. L.; Istanbulluoglu, E.
2017-12-01
We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping the probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding the FRs for each SA on a grid-cell basis. Using landslide observations, we relate the susceptibility index to an empirically derived probability of landslide impact. This probability is combined with results from a physically based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. The vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to a higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA, and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
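A minimal sketch of the frequency ratio (FR) calculation underlying this kind of susceptibility index (the grid-cell and landslide counts below are invented for illustration; a real application would repeat this for all seven site attributes and sum the FRs per grid cell):

```python
# Hypothetical counts for one site attribute (slope class):
# (total grid cells in class, observed landslide cells in class)
classes = {
    "gentle":   (50_000, 100),
    "moderate": (30_000, 600),
    "steep":    (20_000, 900),
}

total_cells = sum(cells for cells, _ in classes.values())
total_slides = sum(slides for _, slides in classes.values())

# FR > 1 means landslides are over-represented in the class
fr = {
    name: (slides / total_slides) / (cells / total_cells)
    for name, (cells, slides) in classes.items()
}

# A grid cell's susceptibility index is then the sum of the FRs of the
# classes it falls into, one FR per site attribute.
print(fr)
```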
Using a statistical process control chart during the quality assessment of cancer registry data.
Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia
2011-01-01
Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
TU-FG-201-05: Varian MPC as a Statistical Process Control Tool
International Nuclear Information System (INIS)
Carver, A; Rowbottom, C
2016-01-01
Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian MPC as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. The MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful for detecting beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might naively be expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in the MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements, before they become out of specification. A. Carver has received a speaker's honorarium from Varian
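The multivariate SPC idea touched on in this abstract can be sketched with Hotelling's T² statistic for two correlated parameters (the QA readings below are hypothetical, not MPC data; in practice the control limit would come from an F-distribution rather than an ad hoc threshold):

```python
from statistics import mean

# Hypothetical daily QA readings: (beam output %, beam uniformity %)
data = [(100.1, 1.2), (99.8, 1.1), (100.0, 1.3), (100.2, 1.2),
        (99.9, 1.1), (100.0, 1.2), (101.5, 1.9)]  # last point drifted

n = len(data)
mx = mean(p[0] for p in data)
my = mean(p[1] for p in data)

# sample covariance matrix (2x2)
sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
det = sxx * syy - sxy ** 2

def t_squared(x, y):
    """Hotelling's T^2 distance of one observation from the mean,
    using the explicit inverse of the 2x2 covariance matrix."""
    dx, dy = x - mx, y - my
    return (dx * (syy * dx - sxy * dy) + dy * (sxx * dy - sxy * dx)) / det

t2 = [t_squared(x, y) for x, y in data]
print(t2)  # the drifted final point stands out
```

The single T² chart replaces separate univariate charts and respects the correlation between the two parameters.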
SU-E-T-205: MLC Predictive Maintenance Using Statistical Process Control Analysis.
Able, C; Hampton, C; Baydush, A; Bright, M
2012-06-01
MLC failure increases accelerator downtime and negatively affects the clinic's treatment delivery schedule. This study investigates the use of Statistical Process Control (SPC), a modern quality control methodology, to retrospectively evaluate MLC performance data and thereby predict the impending failure of individual MLC leaves. SPC, a methodology that detects exceptional variability in a process, was used to analyze MLC leaf velocity data. An MLC velocity test is performed weekly on all leaves during morning QA: the leaves sweep 15 cm across the radiation field with the gantry pointing down, and the leaf speed is analyzed from the generated dynalog file using quality assurance software. MLC leaves for which a known motor failure occurred (8) and those for which no motor replacement was performed (11) were retrospectively evaluated over a 71-week period. SPC individuals and moving range (I/MR) charts were used in the analysis. The I/MR chart limits were calculated using the first twenty weeks of data and set at 3 standard deviations from the mean. The MLCs in which a motor failure occurred followed two general trends: (a) no data indicating a change in leaf speed prior to failure (5 of 8), and (b) a series of data points exceeding the limit prior to motor failure (3 of 8). I/MR charts for a high percentage (8 of 11) of the non-replaced MLC motors indicated that only a single point exceeded the limit; these single-point excesses were deemed false positives. SPC analysis of MLC performance data may therefore be helpful in detecting a significant percentage of impending MLC motor failures. The ability to detect MLC failure may depend on the mode of failure (i.e., gradual or catastrophic). Further study is needed to determine whether increasing the sampling frequency could increase reliability. The project was supported by a grant from Varian Medical Systems, Inc. © 2012 American Association of Physicists in Medicine.
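The I/MR limit construction described above can be sketched as follows (the leaf-speed values are hypothetical; sigma is estimated from the mean moving range via the standard d2 = 1.128 constant for a span-2 moving range):

```python
from statistics import mean

# Hypothetical weekly MLC leaf-speed measurements (cm/s); limits are set
# from the first 20 baseline weeks, as in the study design above.
baseline = [2.50, 2.52, 2.49, 2.51, 2.50, 2.53, 2.48, 2.50, 2.51, 2.49,
            2.52, 2.50, 2.51, 2.49, 2.50, 2.52, 2.48, 2.51, 2.50, 2.49]

moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
mr_bar = mean(moving_ranges)
centre = mean(baseline)

# 3-sigma limits for the individuals chart; sigma estimated as MR-bar/d2
sigma = mr_bar / 1.128
ucl = centre + 3 * sigma
lcl = centre - 3 * sigma

# subsequent weeks are checked against the frozen limits
new_weeks = [2.51, 2.50, 2.42, 2.38]  # a gradually slowing leaf
alarms = [v for v in new_weeks if not (lcl <= v <= ucl)]
print(ucl, lcl, alarms)
```

A run of alarms, rather than a single excursion, is what the study treats as a credible predictor of motor failure.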
Müller, M. F.; Thompson, S. E.
2016-02-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
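The Nash-Sutcliffe coefficient used above to score FDC predictions is straightforward to compute (the observed and simulated flows below are illustrative only):

```python
from statistics import mean

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of model error
    variance to the variance of the observations; 1.0 is a perfect fit."""
    obs_mean = mean(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - obs_mean) ** 2 for o in observed)
    return 1 - sse / sst

obs = [5.0, 8.0, 12.0, 20.0, 9.0, 6.0]   # observed flows
sim = [5.5, 7.5, 11.0, 19.0, 10.0, 6.5]  # simulated flows
print(nse(obs, sim))
```

Values above 0.80, as reported for 75% of the tested catchments, indicate that the model error variance is under a fifth of the natural streamflow variance.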
Incineration as a radioactive waste volume reduction process for CEA nuclear centers
International Nuclear Information System (INIS)
Atabek, R.; Chaudon, L.
1994-01-01
Incineration processes represent a promising solution for waste volume reduction, and will be increasingly used in the future. The features and performance specifications of low-level waste incinerators with capacities ranging from 10 to 20 kg·h⁻¹ at the Fontenay-aux-Roses, Grenoble and Cadarache nuclear centers in France are briefly reviewed. More extensive knowledge of the low-level wastes produced in facilities operated by the Commissariat a l'Energie Atomique (CEA) has allowed us to assess the volume reduction obtained by processing combustible waste in existing incinerators. Research and development work is in progress to improve management procedures for higher-level waste and to build facilities capable of incinerating α-contaminated waste. (authors). 6 refs., 5 figs., 1 tab
Energy Technology Data Exchange (ETDEWEB)
Dreher, Marion; Memmler, Michael; Rother, Stefan; Schneider, Sven [Umweltbundesamt, Dessau (Germany); Boehme, Dieter [Bundesministerium fuer Umwelt, Naturschutz und Reaktorsicherheit, Berlin (Germany)
2012-02-15
In July 2011, the Federal Environment Agency (Dessau-Rosslau, Federal Republic of Germany) and the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (Berlin, Federal Republic of Germany) held the workshop "Bioenergy. Data base for the statistics of renewable energy and the emissions balance". The proceedings volume under consideration contains plenary lectures on the state of knowledge and information needs, as well as materials from the working groups on solid biomass (working group 1), biogas (working group 2) and liquid biomass (working group 3).
A Discussion of the Statistical Investigation Process in the Australian Curriculum
McQuade, Vivienne
2013-01-01
Statistics and statistical literacy can be found in the Learning Areas of Mathematics, Geography, Science, History and the upcoming Business and Economics, as well as in the General Capability of Numeracy and all three Cross-curriculum Priorities. The Australian Curriculum affords many exciting and varied entry points for the teaching of…
International Nuclear Information System (INIS)
MacFarlane, R.E.; Muir, D.W.; Boicourt, R.M.
1982-05-01
The NJOY nuclear data processing system is a comprehensive computer code package for producing cross sections and related nuclear parameters from ENDF/B evaluated nuclear data. This volume provides detailed descriptions of the NJOY module, which contains the executive program and utility subroutines used by the other modules, and it discusses the theory and computational methods of four of the modules used for producing pointwise cross sections: RECONR, BROADR, HEATR, and THERMR
Stochastic processes, optimization, and control theory a volume in honor of Suresh Sethi
Yan, Houmin
2006-01-01
This edited volume contains 16 research articles. It presents recent and pressing issues in stochastic processes, control theory, differential games, optimization, and their applications in finance, manufacturing, queueing networks, and climate control. One of the salient features is that the book is highly multi-disciplinary. The book is dedicated to Professor Suresh Sethi on the occasion of his 60th birthday, in view of his distinguished career.
The perceptual processing capacity of summary statistics between and within feature dimensions
Attarha, Mouna; Moore, Cathleen M.
2015-01-01
The simultaneous–sequential method was used to test the processing capacity of statistical summary representations both within and between feature dimensions. Sixteen gratings varied with respect to their size and orientation. In Experiment 1, the gratings were equally divided into four separate smaller sets, one of which had a mean size that was larger or smaller than that of the other three sets, and one of which had a mean orientation that was tilted further leftward or rightward. The task was to report the mean size and orientation of the oddball sets, which therefore required four summary representations for size and another four for orientation. The sets were presented at the same time in the simultaneous condition or across two temporal frames in the sequential condition. Experiment 1 showed evidence of a sequential advantage, suggesting that the system may be limited with respect to establishing multiple within-feature summaries. Experiment 2 eliminated the possibility that some aspect of the task other than averaging was contributing to this observed limitation. In Experiment 3, the same 16 gratings appeared as one large superset, so the task required only one summary representation for size and another one for orientation. Equal simultaneous–sequential performance indicated that between-feature summaries are capacity free. These findings challenge the view that within-feature summaries drive a global sense of visual continuity across areas of the peripheral visual field, and suggest a shift in focus toward understanding how between-feature summaries in one area of the environment control behavior. PMID:26360153
Varona, M A; Soriano, A; Aguirre-Jaime, A; Barrera, M A; Medina, M L; Bañon, N; Mendez, S; Lopez, E; Portero, J; Dominguez, D; Gonzalez, A
2012-01-01
Liver transplantation, the best option for many end-stage liver diseases, is indicated for more candidates than donor availability can meet. In this situation, this demanding treatment must achieve excellence, accessibility and patient satisfaction to be ethical, scientific, and efficient. The current consensus on quality measurements promoted by the Sociedad Española de Trasplante Hepático (SETH) seeks to define criteria, indicators, and standards for liver transplantation in Spain. Following this recommendation, the Canary Islands liver program has studied its experience. We separated the 411 cadaveric transplants performed in the last 15 years into 2 groups: the first 100 and the other 311. The 8 criteria of SETH 2010 were correctly fulfilled. For most indicators, the outcomes were favorable, with actuarial survival rates at 1, 3, 5, and 10 years of 84%, 79%, 76%, and 65%, respectively; excellent results in retransplant rates (early 0.56% and long-term 5.9%), primary nonfunction rate (0.43%), waiting list mortality (13.34%), and patient satisfaction (91.5%). On the other hand, some mortality indicators were worse, namely perioperative, postoperative, and early mortality with normal graft function, as well as the reoperation rate. After analysis of the series with statistical quality control charts, we observed an improvement in all indicators, even in the apparently worst, early mortality with normal graft function, in a stable program. These results helped us to identify specific areas in which to improve the program. The application of quality measurement, as the SETH consensus recommends, has shown in our study that, despite being a time-consuming process, it is a useful tool. Copyright © 2012 Elsevier Inc. All rights reserved.
Statistical learning and auditory processing in children with music training: An ERP study.
Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne
2017-07-01
The question of whether musical training is associated with enhanced auditory and cognitive abilities in children is of considerable interest. In the present study, we compared children with music training versus those without music training across a range of auditory and cognitive measures, including the ability to implicitly detect statistical regularities in input (statistical learning). Statistical learning of regularities embedded in auditory and visual stimuli was measured in musically trained and age-matched untrained children aged 9-11 years. In addition to collecting behavioural measures, we recorded electrophysiological measures to obtain an online measure of segmentation during the statistical learning tasks. Musically trained children showed better performance on melody discrimination, rhythm discrimination, frequency discrimination, and auditory statistical learning. Furthermore, grand-averaged ERPs showed that triplet onset (initial stimulus) elicited larger responses in the musically trained children during both auditory and visual statistical learning tasks. In addition, children's music skills were associated with performance on auditory and visual behavioural statistical learning tasks. Our data suggest that individual differences in musical skills are associated with children's ability to detect regularities. The ERP data suggest that musical training is associated with better encoding of both auditory and visual stimuli. Although causality must be explored in further research, these results may have implications for developing music-based remediation strategies for children with learning impairments. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
Salzmann-Erikson, Martin; Eriksson, Henrik
2018-02-01
Technology in the healthcare sector is undergoing rapid development. One of the most prominent areas of healthcare in which robots are implemented is nursing homes. However, nursing and technology are often considered contradictory, an attitude originating in the view of "the natural" versus "the artificial". Social media mirror this view, including in attitudes and societal debates regarding nursing and care robots, yet little is known about this topic from previous research. The aim was to examine user behaviour on social media platforms on the topic of nursing and care robots. A retrospective, cross-sectional observation study design was applied. Data were collected via the Alchemy streaming application programming interface, covering the period from 1 January 2014 to 5 January 2016. The data set consisted of 12,311 mentions in total. Nursing and care robots are a small-scale topic of discussion in social media. Twitter was found to be the largest channel in terms of volume, followed by Tumblr. News channels had the highest percentage of visibility, while forums and Tumblr had the least. Of the mentions, 67.9% were positive, 24.4% were negative and 7.8% were neutral. The volume and visibility of the data on nursing robots found in social media, as well as the attitudes to nursing robots found there, indicate that nursing care robots, seen as representing a next step in technological development in healthcare, are a topic on the rise in social media. These findings are likely related to the idea that nursing care robots are on the verge of replacing human labour in healthcare worldwide.
Rolinski, S.; Müller, C.; Lotze-Campen, H.; Bondeau, A.
2010-12-01
More than a quarter of the Earth's land surface is covered by grassland, which also makes up the major part (~70%) of the agricultural area. Most of this area is used for livestock production at different degrees of intensity. The dynamic global vegetation model LPJmL (Sitch et al., Global Change Biology, 2003; Bondeau et al., Global Change Biology, 2007) is one of the few process-based models that simulate biomass production on managed grasslands at the global scale. The implementation of managed grasslands and its evaluation have received little attention so far, as reference data on grassland productivity are scarce and the definitions of grassland extent and usage are highly uncertain. However, grassland productivity concerns large areas and strongly influences global estimates of carbon and water budgets, and should thus be improved. Plants are implemented in LPJmL in an aggregated form as plant functional types, assuming that processes concerning carbon and water fluxes are quite similar between species of the same type. Therefore, the parameterization of a functional type is possible with parameters in a physiologically meaningful range of values. The actual choice of parameter values from the possible and reasonable phase space should satisfy the condition of the best fit between model results and measured data. In order to improve the parameterization of managed grass, we follow a combined procedure using model output and measured data of carbon and water fluxes. By comparing carbon and water fluxes simultaneously, we expect well-balanced refinements and avoid over-tuning the model in only one direction. The comparison of annual grassland biomass with data from the Food and Agriculture Organization of the United Nations (FAO) per country provides an overview of the order of magnitude and identifies deviations. The comparison of daily net primary productivity, soil respiration and water fluxes at specific sites (FluxNet data) provides
Energy Technology Data Exchange (ETDEWEB)
Norton, S.A.; Lindberg, S.E.; Page, A.L. (eds.)
1990-01-01
Acidic precipitation and its effects have been the focus of intense research for over two decades. Recently, research has focused on a greater understanding of dose-response relationships between atmospheric loading of acidifying material and lake acidity. This volume of the subseries Acidic Precipitation emphasizes acid neutralizing processes and the capacity of terrestrial and aquatic systems to assimilate acidifying substances and, conversely, the ability of systems to recover after acid loading diminishes. Eight chapters have been processed separately for inclusion in the appropriate data bases.
International Nuclear Information System (INIS)
Klepetkova, Hana; Thinova, Lenka
2010-01-01
All available data regarding natality in Czechoslovakia (later the Czech Republic) before and after the Chernobyl accident are summarized. Data from the databases of the Czech Statistical Office and of the State Office for Nuclear Safety were used to analyze natality and mortality of children in the Czech Republic and to evaluate the relationship between the level of contamination and the change in the sex ratio at birth that was observed in some areas in November 1986. Although the change in the newborn boys-to-girls ratio was statistically significant, no direct relationship between that ratio and the level of contamination was found. Statistically significant changes in the sex ratio also occurred in Czechoslovakia (or in the Czech Republic) in the past, both before and after the accident. Furthermore, no statistically significant changes in the rates of stillbirths and multiple pregnancies were observed after the Chernobyl accident.
An easy and low cost option for economic statistical process control ...
African Journals Online (AJOL)
a large number of nonconforming products are manufactured. ... size, n, sampling interval, h, and control limit parameter, k, that minimize the ...... [11] Montgomery DC, 2001, Introduction to statistical quality control, 4th Edition, John Wiley, New.
Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril
2014-07-01
Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement. To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings analysed all cases of severe postpartum haemorrhage after vaginal delivery and provided feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries; over the same period the quarterly prevalence never crossed the control limits, that is, the process was never out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
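The quarterly monitoring described in this record is the classic use case for a control chart on proportion data (a p-chart). A minimal sketch follows; the centre-line rate and subgroup size are illustrative assumptions, not figures from the study:

```python
import math

def p_chart_limits(p_bar, n):
    """Three-sigma control limits for a p-chart monitoring a proportion.

    p_bar: centre-line proportion of events (e.g. baseline severe-PPH rate)
    n: number of deliveries in each quarterly subgroup
    """
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    lcl = max(0.0, p_bar - 3.0 * sigma)  # a proportion cannot fall below 0
    ucl = p_bar + 3.0 * sigma
    return lcl, ucl

def out_of_control(points, lcl, ucl):
    """Flag quarters whose observed proportion falls outside the limits."""
    return [p for p in points if p < lcl or p > ucl]

# Illustrative numbers only: 1% baseline rate, 500 deliveries per quarter
lcl, ucl = p_chart_limits(0.01, 500)
flagged = out_of_control([0.012, 0.03], lcl, ucl)
```

With these assumed numbers, a quarter at 1.2% stays inside the limits while one at 3% is flagged as a special-cause signal.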
International Nuclear Information System (INIS)
Huang Guoliang; Yang Xiaoyong; Ma Li; Yang Xu
2011-01-01
In this paper, we developed a metal microfluidic chip with advanced surface processing for ultra-small-volume isothermal molecular amplification. The method takes advantage of nucleic acid amplification with good stability and consistency, high sensitivity of about 31 genomic DNA copies, and bacteria-specific gene identification. Owing to the advanced surface processing, the nucleic acid amplification assay volume was reduced to about 392 nl. A high-numerical-aperture confocal optical detection system was developed to sensitively monitor the DNA amplification with low noise, collecting fluorescence at a power near the optical diffraction limit. A speedy isothermal nucleic acid amplification was performed in the ultra-small-volume microfluidic chip, where the time at the inflexion of the second derivative of the exponential DNA amplification curves was brought forward and the sensitivity was improved about 65-fold relative to the current 25 μl Ep-tube amplification reaction, which indicates promise for clinical molecular diagnostics based on droplet amplification.
Pisipati, Padmapriya
(acrylonitrile-co-methyl acrylate) of Mw = 200 kg/mol to 160 °C as measured via DSC. Glycerin, ethylene glycol and glycerin/water combinations were investigated as potential plasticizers for high-molecular-weight (~200,000 g/mol), high-acrylonitrile-content (93-96 mol %) poly(acrylonitrile-co-methyl acrylate) statistical copolymers. Pure glycerin (25 wt %) induced crystallization followed by a reduced "Tm" of about 213 °C via DSC. However, this composition did not melt-process well. A lower-Mw (~35 kg/mol) copolymer did extrude with no apparent degradation. Our hypothesis is that the hydroxyl groups in glycerin (or water) disrupt the strong dipole-dipole interactions between the chains, enabling the copolymer endothermic transition (Tm) to be reduced and melting to occur before the onset of degradation. Additionally, high-molecular-weight (Mw = 200-230 kg/mol) poly(acrylonitrile-co-methyl acrylate) copolymers with lower acrylonitrile content (82-85 wt %) were synthesized via emulsion copolymerization and successfully melt-pressed. These materials will be further investigated for their utility in packaging applications.
Im, Sung-Ju; Choi, Jungwon; Lee, Jung-Gil; Jeong, Sanghyun; Jang, Am
2018-03-01
A new concept of volume-retarded osmosis and low-pressure membrane (VRO-LPM) hybrid process was developed and evaluated for the first time in this study. Commercially available forward osmosis (FO) and ultrafiltration (UF) membranes were employed in a VRO-LPM hybrid process to overcome energy limitations of draw solution (DS) regeneration and production of permeate in the FO process. To evaluate its feasibility as a water reclamation process, and to optimize the operational conditions, cross-flow FO and dead-end mode UF processes were individually evaluated. For the FO process, a DS concentration of 0.15 g mL -1 of polysulfonate styrene (PSS) was determined to be optimal, having a high flux with a low reverse salt flux. The UF membrane with a molecular weight cut-off of 1 kDa was chosen for its high PSS rejection in the LPM process. As a single process, UF (LPM) exhibited a higher flux than FO, but this could be controlled by adjusting the effective membrane area of the FO and UF membranes in the VRO-LPM system. The VRO-LPM hybrid process only required a circulation pump for the FO process. This led to a decrease in the specific energy consumption of the VRO-LPM process for potable water production, that was similar to the single FO process. Therefore, the newly developed VRO-LPM hybrid process, with an appropriate DS selection, can be used as an energy efficient water production method, and can outperform conventional water reclamation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
G. Lointier
2013-02-01
Full Text Available The Cluster mission offers an excellent opportunity to investigate the evolution of the plasma population in a large part of the inner magnetosphere, explored near its orbit's perigee, over a complete solar cycle. The WHISPER sounder, on board each satellite of the mission, is particularly suitable for studying the electron density in this region, between 0.2 and 80 cm−3. Compiling WHISPER observations during 1339 perigee passes distributed over more than three years of the Cluster mission, we present first results of a statistical analysis dedicated to the study of electron density morphology and dynamics along and across magnetic field lines between L = 2 and L = 10. In this study, we examine a specific topic: the refilling of the plasmasphere and trough regions during extended periods of quiet magnetic conditions. To do so, we survey the evolution of the ap index during the days preceding each perigee crossing and sort the electron density profiles along the orbit into three classes, namely after less than 2 days, between 2 and 4 days, and more than 4 days of quiet magnetic conditions (ap ≤ 15 nT) following an active episode (ap > 15 nT). This leads to three independent data subsets. Comparisons between density distributions in the 3-D plasmasphere and trough regions at the three stages of quiet magnetosphere provide novel views of the distribution of matter inside the inner magnetosphere during several days of low activity. Clear signatures of a refilling process inside an expanding plasmasphere in formation are noted. A plasmapause-like boundary, at L ~ 6 for all MLT sectors, is formed after 3 to 4 days and expands somewhat further after that. In the outer part of the plasmasphere (L ~ 8), latitudinal profiles of median density values vary essentially according to the MLT sector considered rather than according to the refilling duration. The shape of these density profiles indicates that magnetic flux tubes are not
Energy Technology Data Exchange (ETDEWEB)
None
1996-06-01
NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human system interface (HSI) design submittals prepared by licensees or applications for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.
Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki
2014-12-01
As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.
Statistical Signal Process in R Language in the Pharmacovigilance Programme of India.
Kumar, Aman; Ahuja, Jitin; Shrivastava, Tarani Prakash; Kumar, Vipin; Kalaiselvan, Vivekanandan
2018-05-01
The Ministry of Health & Family Welfare, Government of India, initiated the Pharmacovigilance Programme of India (PvPI) in July 2010. The purpose of the PvPI is to collect data on adverse reactions due to medications, analyze it, and use it to recommend informed regulatory interventions, besides communicating the risk to health care professionals and the public. The goal of the present study was to apply statistical tools to find the relationship between drugs and ADRs for signal detection by R programming. Four statistical parameters were proposed for quantitative signal detection: IC025, PRR and PRRlb, chi-square, and N11; we calculated these 4 values using R programming. We analyzed 78,983 drug-ADR combinations, with a total combination count of 420,060. In calculating the statistical parameters, we used 3 variables: (1) N11 (the pair count), (2) N1. (the drug margin), and (3) N.1 (the ADR margin). The structure and calculation of these 4 statistical parameters in R are easily understandable. On the basis of the IC value (IC > 0), we found 8,667 of the 78,983 drug-ADR combinations to be significantly associated. The calculation of statistical parameters in R is time-saving and makes it easy to identify new signals in the Indian ICSR (Individual Case Safety Reports) database.
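The disproportionality measures named in this record can be sketched from the three margins and the total count. The counts below are illustrative, and the IC shown is the simple point estimate only (the IC025 credibility bound used by PvPI, and the PRR lower bound, involve additional variance or shrinkage terms omitted here):

```python
import math

def prr(n11, n1_dot, n_dot1, n):
    """Proportional reporting ratio for one drug-ADR pair.

    n11: reports of the drug with the ADR
    n1_dot: all reports of the drug (drug margin)
    n_dot1: all reports of the ADR (ADR margin)
    n: total reports in the database
    """
    rate_with_drug = n11 / n1_dot
    rate_other_drugs = (n_dot1 - n11) / (n - n1_dot)
    return rate_with_drug / rate_other_drugs

def ic(n11, n1_dot, n_dot1, n):
    """Information component point estimate: log2 of the
    observed-to-expected count under independence of drug and ADR."""
    expected = n1_dot * n_dot1 / n
    return math.log2(n11 / expected)

# Illustrative pair: 20 reports of the combination, drug margin 100,
# ADR margin 200, database total 10,000
signal_prr = prr(20, 100, 200, 10000)   # 11.0
signal_ic = ic(20, 100, 200, 10000)     # log2(10) ≈ 3.32
```

A positive IC (observed count above expectation), as in this illustrative pair, is the screening criterion the study applies.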
Oh, Jungsu S.; Kim, Jae Seung; Chae, Sun Young; Oh, Minyoung; Oh, Seung Jun; Cha, Seung Nam; Chang, Ho-Jong; Lee, Chong Sik; Lee, Jae Hong
2017-03-01
We present an optimized voxelwise statistical parametric mapping (SPM) of partial-volume (PV)-corrected positron emission tomography (PET) with 11C-Pittsburgh Compound B (PiB), incorporating the anatomical precision of magnetic resonance imaging (MRI) and the amyloid-β (Aβ) burden specificity of PiB PET. First, we applied region-based partial-volume correction (PVC), termed the geometric transfer matrix (GTM) method, to PiB PET, creating MRI-based lobar parcels filled with mean PiB uptakes. Then, we conducted a voxelwise PVC by multiplying the original PET by the ratio of the GTM-based PV-corrected PET to a 6-mm-smoothed PV-corrected PET. Finally, we conducted spatial normalizations of the PV-corrected PETs onto the study-specific template. As such, we increased the accuracy of the SPM normalization and the tissue specificity of the SPM results. Moreover, lobar smoothing (instead of whole-brain smoothing) was applied to increase the signal-to-noise ratio in the image without degrading the tissue specificity. Thereby, we could optimize a voxelwise group comparison between subjects with high and normal Aβ burdens (from 10 patients with Alzheimer's disease, 30 patients with Lewy body dementia, and 9 normal controls). Our SPM framework outperformed the conventional one in terms of the accuracy of the spatial normalization (85% of maximum likelihood tissue classification volume) and the tissue specificity (larger gray matter and smaller cerebrospinal fluid volume fractions in the SPM results). Our SPM framework optimized the SPM of PV-corrected Aβ PET in terms of anatomical precision, normalization accuracy, and tissue specificity, resulting in better detection and localization of Aβ burdens in patients with Alzheimer's disease and Lewy body dementia.
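The voxelwise correction step in this record (original PET multiplied by the ratio of the GTM-corrected image to a smoothed copy of it) can be sketched as below. The function and array names, and expressing the smoothing FWHM in voxels, are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def voxelwise_pvc(pet, gtm_pvc, fwhm_vox):
    """Voxelwise partial-volume correction by the ratio method:
    multiply the original PET by (GTM-corrected image / smoothed
    GTM-corrected image). fwhm_vox is the smoothing FWHM in voxels."""
    sigma = fwhm_vox / 2.3548  # convert Gaussian FWHM to sigma
    smoothed = gaussian_filter(gtm_pvc, sigma)
    # avoid division by zero outside the brain: leave those voxels at ratio 1
    ratio = np.divide(gtm_pvc, smoothed,
                      out=np.ones_like(gtm_pvc), where=smoothed > 0)
    return pet * ratio
```

On a spatially uniform GTM image the ratio is 1 everywhere, so the PET passes through unchanged; the correction only redistributes signal where the region-based image has structure.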
Statistical and Fractal Processing of Phase Images of Human Biological Fluids
Directory of Open Access Journals (Sweden)
MARCHUK, Y. I.
2010-11-01
Full Text Available This work performs complex statistical and fractal analyses of the phase properties of birefringent networks of liquid crystals in optically thin layers prepared from human bile. Within the framework of the statistical approach, the authors investigate the values and ranges of variation of the 1st- to 4th-order statistical moments that characterize the coordinate distributions of phase shifts between the orthogonal amplitude components of laser radiation transformed by human bile with various pathologies. Using the Gram-Charlier method, correlation criteria are established for differentiating the phase maps of pathologically changed liquid-crystal networks. Within the fractal approach, the dimensionalities of the self-similar coordinate phase distributions are determined, as well as the features of the transformation of the logarithmic dependences of the power spectra of these distributions for various types of human pathology.
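The 1st- to 4th-order statistical moments of a coordinate phase distribution can be sketched as follows (a minimal NumPy version, assuming the common mean / variance / skewness / excess-kurtosis convention; the record does not give the exact normalization used):

```python
import numpy as np

def phase_moments(phase_map):
    """First four statistical moments of a 2-D map of phase shifts:
    mean, variance, skewness, and excess kurtosis (normal -> 0)."""
    x = np.asarray(phase_map, dtype=float).ravel()
    m1 = x.mean()
    c = x - m1
    m2 = np.mean(c**2)                # variance
    m3 = np.mean(c**3) / m2**1.5      # skewness (dimensionless)
    m4 = np.mean(c**4) / m2**2 - 3.0  # excess kurtosis (dimensionless)
    return m1, m2, m3, m4
```

Comparing how these four numbers shift between phase maps of normal and pathological samples is the core of the statistical approach described above.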
Statistical modelling of space-time processes with application to wind power
DEFF Research Database (Denmark)
Lenzi, Amanda
… This thesis aims at contributing to the wind power literature by building and evaluating new statistical techniques for producing forecasts at multiple locations and lead times using spatio-temporal information. By exploring the features of a rich portfolio of wind farms in western Denmark, we investigate … propose spatial models for predicting wind power generation at two different time scales: for annual average wind power generation and for a high temporal resolution (typically wind power averages over 15-min time steps). In both cases, we use a spatial hierarchical statistical model in which spatial…
Directory of Open Access Journals (Sweden)
Đurić I.
2010-01-01
Full Text Available This paper presents the results of defining a mathematical model that describes the dependence of the leaching degree of Al2O3 in bauxite on the most influential input parameters under industrial conditions of the leaching process in the Bayer technology of alumina production. The mathematical model is defined using the stepwise MLRA method, with R2 = 0.764 and significant statistical reliability (VIF < 2 and p < 0.05), on a one-year statistical sample. Validation of the acquired model was performed using data from the following year, collected from the process conducted under industrial conditions, yielding the same statistical reliability, with R2 = 0.759.
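For readers reproducing this kind of model screening, the two reliability criteria quoted (R2 of the fit and VIF of the regressors) can be computed as follows; this is a generic least-squares sketch, not the authors' stepwise MLRA code:

```python
import numpy as np

def ols_r2(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

def vif(X):
    """Variance inflation factor of each regressor: 1/(1 - R^2_j), where
    R^2_j regresses column j on the remaining columns. VIF near 1 means
    little multicollinearity (diverges if a column is exactly collinear)."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        out.append(1.0 / (1.0 - ols_r2(others, X[:, j])))
    return np.array(out)
```

With orthogonal regressors VIF is exactly 1, matching the paper's VIF < 2 screening threshold for acceptably low collinearity.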
Vitrification process for the volume reduction and stabilization of organic resins
International Nuclear Information System (INIS)
Buelt, J.L.
1982-10-01
Pacific Northwest Laboratory has completed a series of experimental tests sponsored by the US Department of Energy (DOE) to determine the feasibility of incinerating and vitrifying organic ion-exchange resins in a single-step process. The resins used in this study were identical to those used for decontaminating auxiliary building water at the Three Mile Island (TMI) Unit 2 reactor. The primarily organic resins were loaded with nonradioactive isotopes of cesium and strontium for processing in a pilot-scale, joule-heated glass melter modified to support resin combustion. The feasibility tests demonstrated an average process rate of 3.0 kg/h. Based on this rate, if 50 organic resin liners were vitrified in a six-month campaign, a melter 2.5 times the size of the pilot-scale unit would be adequate. A maximum achievable volume reduction of 91% was demonstrated in these tests.
International Nuclear Information System (INIS)
Yang, C.; Wang, Z.; Hollebone, B.; Brown, C.E.; Landriault, M.
2007-01-01
A thorough chemical characterization of oil must be conducted following an oil spill in order to determine the source of the oil, to distinguish the spilled oil from background hydrocarbons and to quantitatively evaluate the extent of impact of the spill. Gas chromatography, flame ionization and mass spectrometry analyses were used in conjunction with statistical data analysis to determine the source of a spill that occurred in 2004 in a harbor in the Netherlands. Three oil samples were collected from the harbor spill, where a thick layer of oil was found between a bunker boat and the quay next to the bunker centre. The 3 samples were sent to different laboratories for a round-robin test to defensibly correlate the spilled oil to the suspected source candidates. The source characterization and identification was validated by quantitative evaluation of 5 petroleum-characteristic alkylated PAH homologous series (naphthalene, phenanthrene, dibenzothiophene, fluorene and chrysene), pentacyclic biomarkers, bicyclic sesquiterpanes and diamondoid compounds. The use of biomarkers for identifying the source of spilled oils has also increased in recent years due to their specificity and high resistance to biodegradation. There was no strong difference among the 3 oil samples according to radar plots of diagnostic ratios of PAHs, isoprenoids, biomarkers, bicyclic sesquiterpanes and diamondoids. The two-tailed unpaired Student's t-tests provided strong evidence for which ship was responsible for the oil spill incident. However, it was cautioned that although two-tailed unpaired Student's t-tests along with oil fingerprinting successfully identified the spill source, the method has limitations. Experimental results showed that the spilled oil and two source candidates were quite similar in both chemical fingerprints and concentration profiles for the determined target hydrocarbons. 17 refs., 4 tabs., 3 figs
Feature Statistics Modulate the Activation of Meaning during Spoken Word Processing
Devereux, Barry J.; Taylor, Kirsten I.; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K.
2016-01-01
Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in ("distinctiveness/sharedness") and likelihood of co-occurrence ("correlational…
On the γ-photon detection processes and the statistics of radiation
International Nuclear Information System (INIS)
Bertolotti, M.; Sibilia, C.
1977-01-01
The problem of detection of γ-photons is treated in the cases of photoelectric and Compton effects. In both cases the probability of detecting a γ-photon is found proportional to the first-order correlation function of the e.m. field. The statistical properties of the γ-radiation can therefore be determined through the methods developed in quantum optics
Mortara, L.; Alcock, Jeffrey R.
2011-01-01
Process scale-up is the change from a feasibility study in a laboratory to a full-scale prototype production process. It is an important issue for the ceramics industry, but has been the subject of relatively little systematic research. This paper will show how certain manufacturing concepts used in a number of industries can be applied to the scale-up of a feasibility-study-level, aqueous tape casting process. In particular, it examines the elements of process standardisa...
International Nuclear Information System (INIS)
Suda, S.; Franssen, F.
1987-01-01
A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
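The density determination described above rests on simple hydrostatics: two bubbler probes a known vertical distance apart give the solution density from their differential pressure, and each probe's pressure then gives the liquid height above that probe. A sketch of that arithmetic (illustrative names and SI units; not BNL's electromanometer software):

```python
# Hedged sketch of bubbler-probe hydrostatics. With two probes separated
# vertically by dh_m (m), the differential pressure dP_pa (Pa) gives the
# solution density; a single probe's pressure then gives liquid height.
G = 9.80665  # standard gravity, m/s^2

def density_from_probes(dP_pa, dh_m):
    """Density (kg/m^3) from differential pressure between two probes."""
    return dP_pa / (G * dh_m)

def height_above_probe(p_pa, rho):
    """Liquid height (m) above a probe from its pressure and the density."""
    return p_pa / (rho * G)
```

Comparing heights derived from probes at several elevations, each averaging over different portions of the liquid column, is what allows the temperature-correction algorithms to be cross-checked as the abstract describes.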
van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.
2016-01-01
For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in
van de Glind, Esther M M; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.
2016-01-01
Background For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. Objective The aim of this study was to investigate the usefulness of statistical process control (SPC) for
van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.
For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in
Directory of Open Access Journals (Sweden)
Isabel M. Joao
2017-09-01
Full Text Available Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of the projects to improve a student-centred approach and boost higher-order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master's students to model distillation systems, together with statistical experimental design techniques, in order to optimize the systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students to model steady-state processes, model dynamic processes and optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students' and teachers' perspectives.
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
International Nuclear Information System (INIS)
Debón, A.; Carlos Garcia-Díaz, J.
2012-01-01
Advanced statistical models can help industry to design more economical and rational investment plans. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing. Increasingly stringent quality requirements in the automotive industry also require ongoing efforts in process control to make processes more robust. Robust methods for estimating the quality of galvanized steel coils are an important tool for the comprehensive monitoring of the performance of the manufacturing process. This study applies different statistical regression models: generalized linear models, generalized additive models and classification trees to estimate the quality of galvanized steel coils on the basis of short time histories. The data, consisting of 48 galvanized steel coils, were divided into sets of conforming and nonconforming coils. Five variables were selected for monitoring the process: steel strip velocity and four bath temperatures. The present paper reports a comparative evaluation of statistical models for binary data using Receiver Operating Characteristic (ROC) curves. A ROC curve is a graphical technique for visualizing, organizing and selecting classifiers based on their performance. The purpose of this paper is to examine their use in research to obtain the best model to predict defective steel coil probability. In relation to the work of other authors, who only propose goodness-of-fit statistics, we should highlight one distinctive feature of the methodology presented here, which is the possibility of comparing the different models with ROC graphs, which are based on model classification performance. Finally, the results are validated by bootstrap procedures.
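The ROC comparison described can be summarized by the area under the curve (AUC), computable directly from scores and labels via the rank-sum (Mann-Whitney) identity; a minimal sketch in which variable names are illustrative:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum identity: the probability
    that a randomly chosen positive case (label 1, e.g. a nonconforming
    coil) outscores a randomly chosen negative case, counting ties as half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
        + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))
```

An AUC of 1.0 corresponds to perfect separation of conforming and nonconforming coils, 0.5 to a classifier no better than chance; comparing candidate models by AUC is one common way to rank them on classification performance, as the ROC-graph comparison in the paper does.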
Hill, Stephen E.; Schvaneveldt, Shane J.
2011-01-01
This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…
Coast Community Coll. District, Costa Mesa, CA.
This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…
Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
International Nuclear Information System (INIS)
Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.
2014-01-01
Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves
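The per-leaf individuals control charts mentioned above compute limits from the measurements themselves; a common moving-range formulation (a generic SPC sketch, not the authors' monitoring system) is:

```python
import numpy as np

def individuals_limits(x):
    """Control limits for an individuals (I) chart from the average moving
    range. The 2.66 factor is 3/d2 with d2 = 1.128 for a moving range of
    span 2, so the limits approximate mean +/- 3 sigma."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))  # average moving range
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar
```

In a setup like the one described, each leaf's measured position series would get its own limits; points outside them (or outside fixed specifications such as +/-0.5 or +/-1 mm) are the ones a monitoring system would flag for review.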
Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A
2014-12-01
High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the
Yang, Yi; Tokita, Midori; Ishiguchi, Akira
2018-01-01
A number of studies revealed that our visual system can extract different types of summary statistics, such as the mean and variance, from sets of items. Although the extraction of such summary statistics has been studied well in isolation, the relationship between these statistics remains unclear. In this study, we explored this issue using an individual differences approach. Observers viewed illustrations of strawberries and lollipops varying in size or orientation and performed four tasks in a within-subject design, namely mean and variance discrimination tasks with size and orientation domains. We found that the performances in the mean and variance discrimination tasks were not correlated with each other and demonstrated that extractions of the mean and variance are mediated by different representation mechanisms. In addition, we tested the relationship between performances in size and orientation domains for each summary statistic (i.e. mean and variance) and examined whether each summary statistic has distinct processes across perceptual domains. The results illustrated that statistical summary representations of size and orientation may share a common mechanism for representing the mean and possibly for representing variance. Introspections for each observer performing the tasks were also examined and discussed.
Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System
Birle, Stephan; Hussein, Mohamed Ahmed; Becker, Thomas
2017-01-01
In food industry, bioprocesses like fermentation often are a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes by the use of traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. How...
Duncan, Fiona; Haigh, Carol
2013-10-01
To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement: routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that
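The key measure here is a proportion (the fraction of patients in severe pain), for which a Shewhart p-chart is the standard SPC tool; a minimal sketch of the generic limit formulas, not the authors' actual charts:

```python
from math import sqrt

def p_chart_limits(p_bar, n):
    """Shewhart p-chart control limits for a proportion with subgroup
    size n: p_bar +/- 3 * sqrt(p_bar * (1 - p_bar) / n), clamped to [0, 1].
    Points outside these limits signal special-cause variation."""
    se = sqrt(p_bar * (1.0 - p_bar) / n)
    lcl = max(0.0, p_bar - 3.0 * se)
    ucl = min(1.0, p_bar + 3.0 * se)
    return lcl, ucl
```

For example, a baseline proportion of 0.30 with 30 patients per period gives limits of roughly 0.05 to 0.55; a sustained run of points below the center line, rather than a single low point, is what would indicate a real improvement in the failure rate.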
Xiang-Guo, Meng; Hong-Yi, Fan; Ji-Suo, Wang
2018-04-01
This paper proposes a kind of displaced thermal state (DTS) and explores how this kind of optical field emerges, using the entangled state representation. The results show that the DTS can be generated by a coherent state passing through a diffusion channel with diffusion coefficient κ only when κt = (e^{ħν/k_B T} − 1)^{−1}. Its statistical properties, such as the mean photon number, Wigner function and entropy, are also investigated.
Categorical data processing for real estate objects valuation using statistical analysis
Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.
2018-05-01
Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
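A minimal illustration of coding categorical variables for such a valuation model; the category values are hypothetical, and the paper's actual coding scheme may differ:

```python
def one_hot(values):
    """Minimal one-hot coding of a categorical property of a real-estate
    object (e.g. wall material or district). Categories are ordered by
    first appearance; each value becomes a 0/1 indicator row usable as
    regression inputs."""
    cats = []
    for v in values:
        if v not in cats:
            cats.append(v)
    rows = [[1 if v == c else 0 for c in cats] for v in values]
    return cats, rows
```

One-hot indicators let a regression model estimate a separate additive price effect per category; ordinal or target-based codings are common alternatives when categories have a natural order or many levels.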
Frontiers in statistical quality control 11
Schmid, Wolfgang
2015-01-01
The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientifi...
International Nuclear Information System (INIS)
Mueller, A.J.; Shields, J.A. Jr.; Buckman, R.W. Jr.
2001-01-01
Variations in oxide species and consolidation method have been shown to have a significant effect on the mechanical properties of oxide dispersion strengthened (ODS) molybdenum material. The mechanical behavior of molybdenum-2 volume % La2O3 mill product forms, produced by CSM Industries by a wet doping process, was characterized over the temperature range of -150 °C to 1800 °C. The various mill product forms evaluated ranged from thin sheet stock to bar stock. Tensile properties of the material in the various product forms were not significantly affected by the vast difference in total cold work. Creep properties, however, were sensitive to the total amount of cold work as well as the starting microstructure. Stress-relieved material had superior creep rupture properties to recrystallized material at 1200 °C, while at 1500 °C and above the opposite was observed. Thus it is necessary to match the appropriate thermo-mechanical processing and microstructure of molybdenum-2 volume % La2O3 to the demands of the application being considered. (author)
2013-07-01
This report documents the performance of two low traffic volume experimental chip seals constructed using : locally available, minimally processed sand and gravel aggregates after four winters of service. The projects : were constructed by CDOT maint...
Waste Receiving and Processing Facility Module 2A: Advanced Conceptual Design Report. Volume 1
Energy Technology Data Exchange (ETDEWEB)
1994-03-01
This ACDR was performed following completion of the Conceptual Design Report in July 1992; the work encompassed August 1992 to January 1994. The mission of the WRAP Module 2A facility is to receive, process, package, certify, and ship for permanent burial at the Hanford site disposal facilities the Category 1 and 3 contact-handled low-level radioactive mixed wastes that are currently in retrievable storage at Hanford and are forecast to be generated over the next 30 years by Hanford, together with waste to be shipped to Hanford from other DOE sites. This volume provides an introduction to the ACDR process and the scope of the task, along with a project summary of the facility, treatment technologies, cost, and schedule. Major areas of departure from the CDR are highlighted. Descriptions of the facility layout and operations are included.
The acid digestion process for radioactive waste: The radioactive waste management series. Volume II
International Nuclear Information System (INIS)
Cecille, L.; Simon, R.
1983-01-01
This volume focuses on the acid digestion process for the treatment of alpha combustible solid waste by presenting detailed performance figures for the principal sub-assemblies of the Alona pilot plant, Belgium. Experience gained from the operation of the US RADTU plant, the only other acid digestion pilot plant, is also summarized, and the performance of these two plants is compared. In addition, the research and development programmes carried out or supported by the Commission of the European Communities are reviewed, and details of an alternative to acid digestion are described. Topics considered include a review of the treatment of actinide-bearing radioactive wastes; alpha waste arisings in fuel fabrication; the Alona Demonstration Facility for the acid digestion process at Eurochemic Mol (Belgium); the treatment of alpha waste at Eurochemic by acid digestion-feed pretreatment and plutonium recovery; US experience with acid digestion of combustible transuranic waste; and the European Communities R and D actions on alpha waste.
Processing plutonium-contaminated soil for volume reduction using the segmented gate system
International Nuclear Information System (INIS)
Moroney, K.S.; Moroney, J.D.; Turney, J.M.; Doane, R.W.
1994-01-01
TMA/Eberline has developed and demonstrated an effective method for removing mixed plutonium and americium contamination from a coral soil matrix at the Defense Nuclear Agency's Johnston Atoll site. TMA's onsite soil processing for volume reduction is ongoing at a rate of over 2000 metric tons per week. The system uses arrays of sensitive radiation detectors coupled with sophisticated computer software developed by Eberline Instrument Corporation. The proprietary software controls four soil sorting units operating in parallel that utilize TMA's unique Segmented Gate System technology to remove radiologically contaminated soil from a moving supply on conveyor belts. Clean soil is released for use elsewhere on the island. Contaminated soil is diverted to either a metal drum for collecting higher-activity "hot" particles (>5000 becquerels), or to a supplementary soil washing process designed to remove finely divided particles of dispersed low-level contamination. Site contamination limits specify maximum dispersed radioactivity of no more than 500 becquerels per kilogram of soil averaged over no more than 0.1 cubic meter. Results of soil processing at this site have been excellent. After processing over 50,000 metric tons, the volume of contaminated material that would have required expensive special handling, packaging, and disposal as radioactive waste has been successfully reduced by over 98 percent. By mid-January 1994, nearly three million kilobecquerels of plutonium/americium contamination had been physically separated from the contaminated feed by TMA's Segmented Gate System, and quality control sampling showed no radioactivity above release criteria in the "clean" soil pile
The unitarity defect of the S-matrix and statistical multistep direct nuclear processes
International Nuclear Information System (INIS)
Hussein, M.S.
1986-09-01
A relation is derived which connects the unitarity defect function S⁺S − 1 with the imaginary part of the absorptive potential responsible for the nuclear scattering. The concept of an angle-dependent reaction cross-section is introduced for this purpose. A similar relation is also obtained for the equivalent quantity S⁺ − S⁻¹. Several applications to nuclear scattering are made, and the possible relevance of this unitarity defect relation to statistical coupled-channels theories of preequilibrium reactions is pointed out and discussed. (Author) [pt
Attributing Meanings to Representations of Data: The Case of Statistical Process Control
Hoyles, Celia; Bakker, Arthur; Kent, Phillip; Noss, Richard
2007-01-01
This article is concerned with the meanings that employees in industry attribute to representations of data and the contingencies of these meanings in context. Our primary concern is to more precisely characterize how the context of the industrial process is constitutive of the meaning of graphs of data derived from this process. We draw on data…
Signals of dynamical and statistical process from IMF-IMF correlation function
Pagano, E. V.; Acosta, L.; Auditore, L.; Baran, V.; Cap, T.; Cardella, G.; Colonna, M.; De Luca, S.; De Filippo, E.; Dell'Aquila, D.; Francalanza, L.; Gnoffo, B.; Lanzalone, G.; Lombardo, I.; Maiolino, C.; Martorana, N. S.; Norella, S.; Pagano, A.; Papa, M.; Piasecki, E.; Pirrone, S.; Politi, G.; Porto, F.; Quattrocchi, L.; Rizzo, F.; Rosato, E.; Russotto, P.; Siwek-Wilczyńska, K.; Trifiro, A.; Trimarchi, M.; Verde, G.; Vigilante, M.; Wilczyńsky, J.
2017-11-01
In this paper we briefly discuss a novel application of the IMF-IMF correlation function to the physical case of binary massive projectile-like fragment (PLF) splitting for dynamical and statistical breakup/fission in heavy-ion collisions at Fermi energies. Theoretical simulations are also shown for comparison with the data. These preliminary results were obtained for the reverse-kinematics reaction 124Sn + 64Ni at 35 AMeV, which was studied using the forward part of the CHIMERA detector. In that reaction, a strong competition between dynamical and statistical components, and its evolution with the charge asymmetry of the binary breakup, was already shown. In this work we show that the IMF-IMF correlation function can be used to pin down the timescale of fragment production in binary fission-like phenomena. We also performed simulations with the CoMDII model in order to compare with the experimental IMF-IMF correlation function. In the future we plan to extend these studies to different reaction mechanisms and nuclear systems and to compare with different theoretical transport simulations.
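A two-fragment correlation function of this kind is typically built as the ratio of the same-event pair distribution to a mixed-event (uncorrelated) reference, each normalized to unit integral. The following is a generic sketch under that assumption, not the CHIMERA analysis code:

```python
import numpy as np

def correlation_function(same_event_pairs, mixed_event_pairs, bins):
    """1 + R: ratio of the distribution of a relative pair observable
    (e.g. relative momentum or velocity) for pairs from the same event to
    that for mixed-event pairs, with both normalized to equal integrals.
    Returns bin centers and the ratio (0 where the reference is empty)."""
    num, edges = np.histogram(same_event_pairs, bins=bins)
    den, _ = np.histogram(mixed_event_pairs, bins=edges)
    num = num / max(num.sum(), 1)
    den = den / max(den.sum(), 1)
    ratio = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, ratio
```

Deviations of the ratio from 1 at small relative observables carry the space-time information: suppression or enhancement patterns there are what allow emission timescales to be pinned down.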
Energy Technology Data Exchange (ETDEWEB)
Tratnyek, P. G.; Bylaska, Eric J.; Weber, Eric J.
2017-01-01
Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with “in silico” results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for “in silico environmental chemical science” are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.
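The QSAR calibration described above, regressing a property such as log k on computed descriptor variables, reduces to an ordinary least-squares fit. A minimal sketch; the descriptor names (e.g. E_HOMO, logP) and all numbers are hypothetical, not data from the article:

```python
import numpy as np

# Hypothetical linear QSAR: log k = b0 + b1*E_HOMO + b2*logP, calibrated
# against descriptor values computed from molecular modeling. All values
# below are illustrative, not measured or published data.
def fit_qsar(descriptors, log_k):
    """Ordinary least-squares fit; returns coefficients [b0, b1, b2, ...]."""
    X = np.column_stack([np.ones(len(log_k)), descriptors])
    coef, *_ = np.linalg.lstsq(X, log_k, rcond=None)
    return coef

def predict(coef, descriptors):
    """Predict the property for chemicals lacking experimental data."""
    X = np.column_stack([np.ones(len(descriptors)), descriptors])
    return X @ coef
```

Once calibrated, `predict` fills property gaps for untested chemicals, which is the "in silico" use the abstract describes.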
Tratnyek, Paul G; Bylaska, Eric J; Weber, Eric J
2017-03-22
Quantitative structure-activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with "in silico" results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for "in silico environmental chemical science" are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.
Comparison of multivariate and univariate statistical process control and monitoring methods
International Nuclear Information System (INIS)
Leger, R.P.; Garland, WM.J.; Macgregor, J.F.
1996-01-01
Work in recent years has led to the development of multivariate process monitoring schemes which use Principal Component Analysis (PCA). This research compares the performance of a univariate scheme and a multivariate PCA scheme used for monitoring a simple process with 11 measured variables. The multivariate PCA scheme was able to adequately represent the process using two principal components. This resulted in a PCA monitoring scheme which used two charts as opposed to 11 charts for the univariate scheme, and therefore had distinct advantages in terms of data representation, presentation, and fault diagnosis capabilities. (author)
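The PCA monitoring scheme summarized above can be sketched generically: fit loadings on in-control data, then chart Hotelling's T² computed on the retained scores. This is an illustration under assumed details (two retained components, simulated data), not the paper's 11-variable process:

```python
import numpy as np

# Minimal PCA-based monitoring sketch: T^2 is computed on the scores of the
# retained components. Training data below would be in-control observations.
def fit_pca_monitor(X_train, n_comp=2):
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_comp].T                               # loadings (p x n_comp)
    var = (s[:n_comp] ** 2) / (len(X_train) - 1)    # score variances
    return mu, P, var

def t2_statistic(x, mu, P, var):
    """Hotelling-type T^2 of one observation on the retained scores."""
    t = (x - mu) @ P                                # project onto components
    return float(np.sum(t ** 2 / var))
```

A point whose T² exceeds a control limit (e.g. an F-distribution-based limit) would be flagged, so only one or two charts are watched instead of one per variable.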
Directory of Open Access Journals (Sweden)
Peter James Allen
2016-02-01
Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these ‘experts’ were able to describe a far more systematic, comprehensive, flexible and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in
Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D
2016-01-01
Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid
Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.
2016-01-01
Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid
Computer processing of 14C data; statistical tests and corrections of data
International Nuclear Information System (INIS)
Obelic, B.; Planinic, J.
1977-01-01
The described computer program calculates the age of samples and performs statistical tests and corrections of data. Data are obtained from a proportional counter that measures anticoincident pulses per 20-minute intervals. After every 9th interval the counter measures the total number of counts per interval. Input data are punched on cards. The output list contains the input data schedule and the following results: mean CPM value, correction of CPM to normal pressure and temperature (NTP), sample age calculation based on 14C half-lives of 5570 and 5730 years, age correction for NTP, dendrochronological corrections and the relative radiocarbon concentration. All results are given with one standard deviation. An input data test (Chauvenet's criterion), gas purity test, standard deviation test and a test of the data processor are also included in the program. (author)
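The central age calculation the program performs follows from the decay law, t = (T½ / ln 2) · ln(A₀/A), evaluated with both half-lives mentioned above. A sketch; the count rates are hypothetical net CPM values, not data from the paper:

```python
import math

# Radiocarbon age from net count rates (background-corrected CPM).
# t = (T_half / ln 2) * ln(A_modern / A_sample); half-lives of 5570 a
# (Libby) and 5730 a are the two values the abstract mentions.
def radiocarbon_age(cpm_sample, cpm_modern, half_life=5570.0):
    """Age in years; cpm values are hypothetical illustrative inputs."""
    return half_life / math.log(2.0) * math.log(cpm_modern / cpm_sample)
```

A sample counting at half the modern activity is, by construction, one half-life old under either convention.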
Process automation system for integration and operation of Large Volume Plasma Device
International Nuclear Information System (INIS)
Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.
2016-01-01
Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing to the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access to the diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
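The automation backbone described above uses Modbus over RS485. Every Modbus RTU frame carries a CRC-16 checksum (reflected polynomial 0xA001, initial value 0xFFFF, appended low byte first); a sketch of that standard checksum, not code from the LVPD system:

```python
# CRC-16/MODBUS as used by Modbus RTU framing: init 0xFFFF, reflected
# polynomial 0xA001, no final XOR. The CRC is appended low byte first,
# and a receiver recomputing over frame+CRC must obtain zero.
def modbus_crc16(frame: bytes) -> int:
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc
```

This is how each PAS node validates frames before acting on a register read/write request.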
Process automation system for integration and operation of Large Volume Plasma Device
Energy Technology Data Exchange (ETDEWEB)
Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.
2016-11-15
Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing to the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access to the diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.
2016-01-01
Background: For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. Objective: The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in relation to fracture risk with alendronate versus placebo in postmenopausal women. Methods: We performed a post hoc analysis of the Fracture Intervention Trial (FIT), a randomized, con...
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
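The harmonic Poisson process described above can be simulated directly: on an interval [a, b] the intensity λ(x) = c/x gives an expected point count of c·ln(b/a), and given the count, positions are i.i.d. with density proportional to 1/x. A sketch with illustrative parameters (c and the interval are assumptions, not values from the paper):

```python
import math, random

# Poisson process on [a, b] with harmonic intensity lambda(x) = c / x.
# Expected number of points: c * ln(b / a).
def sample_harmonic_poisson(c, a, b, rng):
    mean = c * math.log(b / a)
    # Knuth's method for a Poisson variate (adequate for moderate means)
    n, p, limit = 0, 1.0, math.exp(-mean)
    while p > limit:
        p *= rng.random()
        n += 1
    n -= 1
    # Positions: inverse-CDF sampling of density 1/x on [a, b] gives
    # x = a * (b / a) ** U for U ~ Uniform(0, 1)
    return [a * (b / a) ** rng.random() for _ in range(n)]
```

Note the scale invariance visible in the sampler: only the ratio b/a enters, which is one of the properties the paper connects to Benford's law and power-law renormalization.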
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Energy Technology Data Exchange (ETDEWEB)
Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States)
1993-10-01
The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance related events at nuclear power plants. The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provides a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume II, is a field manual for use by investigators when performing event investigations. Volume II includes the HPIP Procedure, the HPIP Modules, and Appendices that provide extensive documentation of each investigation technique.
Wang, Jin; Wang, Hui-Ping; Wang, Xiaojie; Cui, Haichao; Lu, Fenggui
2015-03-01
This paper investigates the hot cracking rate in fiber laser welding of Al under various process conditions and performs corresponding process optimization. First, the effects of welding process parameters such as the distance between the welding center line and its closest trim edge, laser power and welding speed on the hot cracking rate were investigated experimentally with response surface methodology (RSM). The hot cracking rate in this paper is defined as the ratio of hot cracking length to total weld seam length. Based on the experimental results following a Box-Behnken design, a prediction model for the hot cracking rate was developed using a second-order polynomial function considering only two-factor interactions. The initial prediction result indicated that the established model could predict the hot cracking rate adequately within the range of welding parameters used. The model was then used to optimize welding parameters to achieve cracking-free welds.
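The model-building step above can be sketched generically: fit a second-order polynomial with two-factor interactions to three-factor Box-Behnken design data by least squares. The factor coding and response values below are illustrative, not the weld data from the paper:

```python
import numpy as np
from itertools import combinations

# Three-factor Box-Behnken design (edge midpoints of a cube + center run)
# in coded units; in the paper the factors would be edge distance, laser
# power and welding speed, and y the measured hot cracking rate.
def box_behnken_3():
    pts = []
    for i, j in combinations(range(3), 2):
        for si in (-1.0, 1.0):
            for sj in (-1.0, 1.0):
                p = [0.0, 0.0, 0.0]
                p[i], p[j] = si, sj
                pts.append(p)
    pts.append([0.0, 0.0, 0.0])  # center run
    return np.array(pts)

def quadratic_design_matrix(X):
    cols = [np.ones(len(X))]
    cols += [X[:, k] for k in range(3)]                    # linear terms
    cols += [X[:, k] ** 2 for k in range(3)]               # pure quadratics
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]  # 2FI
    return np.column_stack(cols)

def fit_rsm(X, y):
    """Least-squares fit of the second-order model with 2FI terms."""
    A = quadratic_design_matrix(X)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

Optimization then amounts to minimizing the fitted polynomial over the coded factor region.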
The NJOY Nuclear Data Processing System: Volume 3, The GROUPR, GAMINR, and MODER modules
International Nuclear Information System (INIS)
MacFarlane, R.E.; Muir, D.W.
1987-10-01
The NJOY Nuclear Data Processing System is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from ENDF/B-IV, V, or VI evaluated nuclear data. A concise description of the code system and references to the ancestors of NJOY are given in Vol. 1 of this report. This volume describes the GROUPR module, which produces multigroup neutron interaction cross sections and group-to-group production cross sections for neutrons and photons; the GAMINR module, which produces multigroup photon-interaction cross sections and group-to-group matrices; and the MODER module, which converts ENDF/B and NJOY interface files back and forth between formatted (i.e., BCD, ASCII) and binary modes and performs several associated editing functions. 34 refs., 13 figs
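The group averaging GROUPR performs can be illustrated by its central operation, the flux-weighted collapse σ_g = ∫_g φ(E)σ(E) dE / ∫_g φ(E) dE. A simplified numerical sketch; the pointwise data and group boundaries are made up, and NJOY itself uses ENDF interpolation laws and adaptive quadrature rather than this fixed trapezoidal panel:

```python
import numpy as np

# Flux-weighted multigroup collapse of a pointwise cross section:
#   sigma_g = (int_g phi(E) * sigma(E) dE) / (int_g phi(E) dE)
def _trapezoid(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def collapse(E, sigma, phi, group_bounds):
    sig_g = []
    for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
        grid = np.linspace(lo, hi, 201)       # refined panel for quadrature
        s = np.interp(grid, E, sigma)         # pointwise cross section
        w = np.interp(grid, E, phi)           # weight flux
        sig_g.append(_trapezoid(s * w, grid) / _trapezoid(w, grid))
    return np.array(sig_g)
```

With a flat weight flux the group value is just the average of the cross section over the group, which makes the sketch easy to check by hand.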
Shukla, Asmita; Shukla, Sanjay; Annable, Michael D.; Hodges, Alan W.
2017-08-01
Stormwater detention areas (SDAs) play an important role in treating end-of-farm runoff in phosphorus (P) limited agroecosystems. Phosphorus transport from SDAs, including through subsurface pathways, is not well understood. The prevailing understanding of these systems assumes that biogeochemical processes play the primary treatment role and that subsurface losses can be neglected. Water and P fluxes from an SDA located in a row-crop farm were measured for two years (2009-2011) to assess the SDA's role in reducing downstream P loads. The SDA treated 55% (497 kg) and 95% (205 kg) of the incoming load during Year 1 (Y1, 09-10) and Year 2 (Y2, 10-11), respectively. These treatment efficiencies were similar to the surface water volumetric retention (49% in Y1 and 84% in Y2) and varied primarily with rainfall. Similar water volume and P retentions indicate that volume retention is the main process controlling P loads. A limited role of biogeochemical processes was supported by low to no remaining soil P adsorption capacity due to long-term drainage P input. The fact that outflow P concentrations (Y1 = 368.3 μg L-1, Y2 = 230.4 μg L-1) could be approximated by a simple mixing of rainfall and drainage P input further confirmed the nearly inert biogeochemical processes. Subsurface P losses through groundwater were 304 kg (27% of inflow P), indicating that they are an important source of downstream P. Including subsurface P losses reduces the treatment efficiency to 35% (from 61%). The aboveground biomass in the SDA contained 42% (240 kg) of the average incoming P load, suggesting that biomass harvesting could be a cost-effective alternative for reviving the role of biogeochemical processes to enhance P treatment in aged, P-saturated SDAs. The 20-year present economic value of P removal through harvesting was estimated to be 341,000, which if covered through a cost share or a payment for P treatment services program could be a positive outcome for both
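The quoted efficiencies can be cross-checked with a small mass balance built from the figures above (kg of P; "treated" meaning retained by the SDA). The reconstruction lands within a point or two of the reported 61% and 35%, the gap presumably being rounding in the abstract:

```python
# Back-of-the-envelope P mass balance from the quoted numbers.
def treatment_efficiency(retained, inflow, subsurface_loss=0.0):
    """Percent of inflow P retained, optionally netting subsurface losses."""
    return (retained - subsurface_loss) / inflow * 100.0

inflow_y1 = 497 / 0.55      # 497 kg retained was 55% of Year-1 inflow
inflow_y2 = 205 / 0.95      # 205 kg retained was 95% of Year-2 inflow
inflow = inflow_y1 + inflow_y2
surface_only = treatment_efficiency(497 + 205, inflow)            # ~61-63%
with_subsurface = treatment_efficiency(497 + 205, inflow, 304.0)  # ~35%
```

The second figure shows how counting the 304 kg groundwater pathway roughly halves the apparent treatment efficiency.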
Surface of Maximums of AR(2) Process Spectral Densities and its Application in Time Series Statistics
Directory of Open Access Journals (Sweden)
Alexander V. Ivanov
2017-09-01
Conclusions. The obtained formula for the surface of maximums of noise spectral densities makes it possible to see for which values of the AR(2) process characteristic polynomial coefficients it is possible to look for a greater rate of convergence to zero of the probabilities of large deviations of the considered estimates.
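For context, the AR(2) spectral density whose maxima the surface describes can be evaluated directly. A sketch with illustrative coefficients, using a grid search rather than the paper's closed-form surface:

```python
import numpy as np

# Spectral density of an AR(2) process X_t = a1*X_{t-1} + a2*X_{t-2} + e_t:
#   f(lam) = (sigma^2 / (2*pi)) / |1 - a1*e^{-i lam} - a2*e^{-2i lam}|^2
def ar2_spectral_density(lam, a1, a2, sigma2=1.0):
    z = np.exp(-1j * lam)
    return sigma2 / (2 * np.pi) / np.abs(1 - a1 * z - a2 * z ** 2) ** 2

def spectral_max(a1, a2, n=100001):
    """Grid search for the maximum of f over [0, pi]."""
    lam = np.linspace(0.0, np.pi, n)
    f = ar2_spectral_density(lam, a1, a2)
    return lam[np.argmax(f)], float(f.max())
```

With real characteristic roots (e.g. a1 = 0.5, a2 = 0) the maximum sits at frequency zero; with strongly complex roots (e.g. a1 = 1, a2 = -0.9) the process is pseudo-cyclical and the peak moves to an interior frequency.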
Borovkova, Svetlana; Burton, Robert; Dehling, Herold
2001-01-01
In this paper we develop a general approach for investigating the asymptotic distribution of functionals X_n = f((Z_{n+k})_{k∈ℤ}) of absolutely regular stochastic processes (Z_n)_{n∈ℤ}. Such functionals occur naturally as orbits of chaotic dynamical systems, and thus our results can be used to study
Implementation of a real-time statistical process control system in hardwood sawmills
Timothy M. Young; Brian H. Bond; Jan Wiedenbeck
2007-01-01
Variation in sawmill processes reduces the financial benefit of converting fiber from a log into lumber. Lumber is intentionally oversized during manufacture to allow for sawing variation, shrinkage from drying, and final surfacing. This oversizing of lumber due to sawing variation requires higher operating targets and leads to suboptimal fiber recovery. For more than...
Use of statistical process control in the production of blood components
DEFF Research Database (Denmark)
Magnussen, K.; Quere, S.; Winkel, P.
2008-01-01
occasional component manufacturing staff to an experienced regular manufacturing staff. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood...
Jarman, Jay
2011-01-01
This dissertation focuses on developing and evaluating hybrid approaches for analyzing free-form text in the medical domain. This research draws on natural language processing (NLP) techniques that are used to parse and extract concepts based on a controlled vocabulary. Once important concepts are extracted, additional machine learning algorithms,…
Multivariate statistical modelling of the pharmaceutical process of wet granulation and tableting
Westerhuis, Johannes Arnold
1997-01-01
Wet granulation in high-shear mixers is a process of particle size enlargement much used in the pharmaceutical industry to improve the tableting properties of powder mixtures, such as flowability and compactibility, necessary for the large-scale production of pharmaceutical tablets. ... See: Summary
International Nuclear Information System (INIS)
Ung, M.N.; Wee, Leonard
2010-01-01
Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body (PBRB) image registration during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). Statistical process control charts for individual patients can be designed to identify potentially significant deviation of the FRE from expected behaviour. In this study, the aim was to retrospectively apply statistical process control methods to the FREs of 34 individuals to identify parameters that may impact process stability in image-based localization. A robust procedure for estimating control parameters, control limits and fixed tolerance levels from a small number of initial observations has been proposed and discussed. Four distinct types of qualitative control chart behaviour have been observed. Probable clinical factors leading to IGRT process instability are discussed in light of the control chart behaviour. Control charts have been shown to be a useful decision-making tool for detecting potentially out-of-control processes on an individual basis. They can sensitively identify potential problems that warrant more detailed investigation in the IGRT of prostate cancer.
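An individuals-type control chart of the kind applied to such an FRE series can be sketched with moving-range limits, x̄ ± 2.66·MR̄. The values below are invented, and a clinical implementation would estimate the limits from a baseline phase rather than retrospectively from the whole series:

```python
# Individuals (X) control chart: limits from the average moving range.
def individuals_chart(values):
    """Return (center, LCL, UCL, indices of out-of-control points)."""
    n = len(values)
    xbar = sum(values) / n
    mr = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mrbar = sum(mr) / len(mr)
    ucl = xbar + 2.66 * mrbar     # 2.66 = 3 / d2 for moving ranges of 2
    lcl = xbar - 2.66 * mrbar
    out = [i for i, v in enumerate(values) if v > ucl or v < lcl]
    return xbar, lcl, ucl, out
```

A flagged point would prompt the more detailed investigation the abstract mentions, rather than an automatic clinical action.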
Di roma, Antonella; Vaccaro, Carmela
2017-04-01
Groundwater should not be seen only as a reserve for water supply; it should also be protected for its environmental value. Groundwater plays an essential role in the hydrological cycle, for which characterization, pollution prevention, monitoring and restoration are essential in view of the recovery and identification of the water bodies to be submitted to recharge for adaptation to DM n. 100/2016. The groundwater of Ferrara province presents salinisation problems and pollution by noxious metals that can be mitigated through recharge processes evaluated on the basis of site-specific characteristics. It is essential to know the hydrogeochemical characteristics of the different aquifer levels. To this end, analytical results of groundwater were discussed (2014-2015 monitoring of phreatic groundwater, and 2003-2015 temporal series of A1-A2-A3 samples from the Emilia Romagna databases). Results showed that both salinization and refreshening processes occur in the territory analyzed. Factor analysis (FA) conducted on the samples divided them into three groups: 1) samples affected by ionic exchange; 2) pH reaction on heavy metals; 3) samples affected by mineralization. The geochemical groundwater facies changed from Ca-HCO3 to Na-HCO3, with a small group of Ca-SO4 samples, and through geochemical investigations the reactions that take place when waters of different composition mix were observed. The Na excesses are explained by ionic exchange processes; a determinant role is played by the exchange between Ca and Na. Also important in this territory is the role of CH4, which typically rises towards the surface along faults and fractures and drives the rise of deep water of different composition. In samples selected from FA Group 1, an increase of the CEC (cation exchange capacity) was observed. Adsorption-desorption exchanges take place between the water and the fine sediment fraction rich in clay minerals. Higher CEC values are found in rich organic substance
Chattopadhyay, Anirban; Khondekar, Mofazzal Hossain; Bhattacharjee, Anup Kumar
2017-09-01
In this paper we search for periodicities in the linear speed of coronal mass ejections (CMEs) in solar cycle 23. Double exponential smoothing and the Discrete Wavelet Transform are used for detrending and filtering of the CME linear speed time series. To choose the appropriate statistical methodology, the Smoothed Pseudo Wigner-Ville Distribution (SPWVD) was used beforehand to confirm the non-stationarity of the time series. Time-frequency representation tools, the Hilbert-Huang Transform and Empirical Mode Decomposition, were implemented to unearth the underlying periodicities in the non-stationary time series of CME linear speed. Of all the periodicities with more than 95% confidence level, the relevant ones were segregated using an integral peak detection algorithm. The periodicities observed are short, ranging from 2-159 days, with relevant periods at 4, 10, 11, 12, 13.7, 14.5 and 21.6 days. These short-range periodicities indicate that the probable origin of the CMEs is the active longitudes and the magnetic flux network of the Sun. The results also suggest probable mutual influence and causality with other solar activities (such as solar radio emission, Ap index and solar wind speed), owing to the similarity between their periods and the CME linear speed periods. The periodicities of 4 and 10 days indicate the possible existence of Rossby-type (planetary) waves in the Sun.
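The detrending step named above, double exponential smoothing, can be sketched with Holt's recursions: the series is detrended by subtracting the one-step-ahead smoothed forecast. The smoothing constants here are illustrative, not those used in the study:

```python
# Holt's double exponential smoothing: level and trend recursions.
def double_exp_smooth(y, alpha=0.3, beta=0.1):
    """Return one-step-ahead smoothed forecasts for each point of y."""
    level, trend = y[0], y[1] - y[0]
    fitted = [y[0]]
    for t in range(1, len(y)):
        forecast = level + trend
        fitted.append(forecast)
        new_level = alpha * y[t] + (1 - alpha) * forecast
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return fitted

def detrend(y, alpha=0.3, beta=0.1):
    """Residual series: observation minus smoothed forecast."""
    f = double_exp_smooth(y, alpha, beta)
    return [yi - fi for yi, fi in zip(y, f)]
```

On a perfectly linear trend the forecasts are exact, so the residuals vanish; on a real CME speed series the residuals are what the subsequent time-frequency analysis would examine.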
Baijal, Shruti; Nakatani, Chie; van Leeuwen, Cees; Srinivasan, Narayanan
2013-06-07
Human observers show remarkable efficiency in statistical estimation; they are able, for instance, to estimate the mean size of visual objects, even if their number exceeds the capacity limits of focused attention. This ability has been understood as the result of a distinct mode of attention, i.e. distributed attention. Compared to the focused attention mode, working memory representations under distributed attention are proposed to be more compressed, leading to reduced working memory loads. An alternate proposal is that distributed attention uses less structured, feature-level representations. These would fill up working memory (WM) more, even when target set size is low. Using event-related potentials, we compared WM loading in a typical distributed attention task (mean size estimation) to that in a corresponding focused attention task (object recognition), using a measure called contralateral delay activity (CDA). Participants performed both tasks on 2, 4, or 8 different-sized target disks. In the recognition task, CDA amplitude increased with set size; notably, however, in the mean estimation task the CDA amplitude was high regardless of set size. In particular for set-size 2, the amplitude was higher in the mean estimation task than in the recognition task. The result showed that the task involves full WM loading even with a low target set size. This suggests that in the distributed attention mode, representations are not compressed, but rather less structured than under focused attention conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek
2013-01-01
Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.
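The final pipeline step, region-based statistics, amounts to aggregating voxel intensities by parcellation label. A minimal sketch of that idea, using hypothetical flat arrays rather than the Midas pipeline's actual code:

```python
def region_stats(image, labels):
    """Per-label mean intensity for a parcellated image, given two
    flat lists of equal length: voxel intensities and region labels."""
    sums, counts = {}, {}
    for value, label in zip(image, labels):
        sums[label] = sums.get(label, 0.0) + value
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

# four voxels, two regions (0 and 1)
stats = region_stats([1.0, 2.0, 3.0, 4.0], [0, 0, 1, 1])
```

Comparing such per-region statistics across subjects is what allows differences between populations to be detected.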
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Marrakesh International Conference on Probability and Statistics
Ouassou, Idir; Rachdi, Mustapha
2015-01-01
This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.
EUROPEAN INTEGRATION: A MULTILEVEL PROCESS THAT REQUIRES A MULTILEVEL STATISTICAL ANALYSIS
Directory of Open Access Journals (Sweden)
Roxana-Otilia-Sonia HRITCU
2015-11-01
Full Text Available With a process of market regulation, a system of multi-level governance and several supranational, national and subnational levels of decision making, European integration qualifies as a multilevel phenomenon. The individual characteristics of citizens, as well as the environment in which the integration process takes place, are important. To understand European integration and its consequences it is important to develop and test multi-level theories that consider individual-level characteristics as well as the overall context in which individuals act and express those characteristics. A central argument of this paper is that support for European integration is influenced by factors operating at different levels. We review and present theories and related research on the use of multilevel analysis in the European area. This paper draws insights on various aspects and consequences of European integration, to take stock of what we know about how and why to use multilevel modeling.
International Nuclear Information System (INIS)
Naquid G, C.
2003-01-01
In this work, the methodology for the control of the irradiation service is presented, as applied to disposable products of medical use, foods, medications, cosmetics and various other products irradiated to different doses in the Gamma Irradiation Plant, with cobalt-60 radiation, by means of quality control charts by variables. These charts are used to check whether the irradiation process has been carried out appropriately, or whether the variations surpass the established limits, indicating the influence of some other factor. (Author)
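The control-chart-by-variables logic can be sketched generically with a Shewhart X-bar chart. The A2 factors below are standard tabulated Shewhart constants; the dose readings are hypothetical, not data from the plant:

```python
def xbar_limits(subgroup_means, subgroup_ranges, n):
    """Shewhart X-bar chart limits from subgroup means and ranges,
    using the tabulated A2 constant for subgroup size n."""
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}[n]
    xbar_bar = sum(subgroup_means) / len(subgroup_means)
    r_bar = sum(subgroup_ranges) / len(subgroup_ranges)
    return xbar_bar - A2 * r_bar, xbar_bar, xbar_bar + A2 * r_bar

# hypothetical absorbed-dose readings (kGy): five subgroups of size 4
lcl, center, ucl = xbar_limits([25.1, 25.3, 24.9, 25.0, 25.2],
                               [0.4, 0.5, 0.3, 0.6, 0.4], n=4)
# a new subgroup mean of 25.9 kGy falls outside the limits
out_of_control = not (lcl <= 25.9 <= ucl)
```

A point outside the limits is exactly the "variation surpassing the established limits" that signals an assignable cause in the irradiation process.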
International Nuclear Information System (INIS)
Theodorsen, A; Garcia, O E; Rypdal, M
2017-01-01
Filtered Poisson processes are often used as reference models for intermittent fluctuations in physical systems. Such a process is here extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The lowest order moments, probability density function, auto-correlation function and power spectral density are derived and used to identify and compare the effects of the two different noise terms. Monte-Carlo studies of synthetic time series are used to investigate the accuracy of model parameter estimation and to identify methods for distinguishing the noise types. It is shown that the probability density function and the three lowest order moments provide accurate estimations of the model parameters, but are unable to separate the noise types. The auto-correlation function and the power spectral density also provide methods for estimating the model parameters, as well as being capable of identifying the noise type. The number of times the signal crosses a prescribed threshold level in the positive direction also promises to be able to differentiate the noise type. (paper)
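A filtered Poisson (shot-noise) process with a purely additive noise term, as in the first model variant above, can be simulated directly. This is an illustrative sketch with arbitrary parameters, assuming a one-sided exponential pulse shape and exponentially distributed amplitudes:

```python
import math
import random

def filtered_poisson(duration, rate, tau, amp_mean, dt, noise_sd, rng):
    """Shot noise: Poisson pulse arrivals at the given rate, pulse shape
    exp(-t/tau), exponential amplitudes, plus additive Gaussian noise."""
    n_steps = int(duration / dt)
    signal = [0.0] * n_steps
    t = rng.expovariate(rate)
    while t < duration:
        amp = rng.expovariate(1.0 / amp_mean)
        for k in range(math.ceil(t / dt), n_steps):
            elapsed = k * dt - t
            if elapsed > 10.0 * tau:   # pulse has decayed to nothing
                break
            signal[k] += amp * math.exp(-elapsed / tau)
        t += rng.expovariate(rate)    # exponential waiting times
    return [s + rng.gauss(0.0, noise_sd) for s in signal]

rng = random.Random(1)
x = filtered_poisson(duration=500.0, rate=2.0, tau=1.0,
                     amp_mean=1.0, dt=0.1, noise_sd=0.1, rng=rng)
# lowest-order moment: theoretical mean is rate * amp_mean * tau = 2
sample_mean = sum(x) / len(x)
```

Comparing sample moments of such synthetic series against their analytical values is the kind of Monte-Carlo check of parameter estimation described in the abstract.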
International Nuclear Information System (INIS)
Kim, Yong Il; Im, Hyung Jun; Paeng, Jin Chul; Lee, Jae Sung; Eo, Jae Seon; Kim, Dong Hyun; Kim, Euishin E.; Kang, Keon Wook; Chung, June Key; Lee Dong Soo
2012-01-01
18F-FP-CIT positron emission tomography (PET) is an effective imaging method for dopamine transporters. In usual clinical practice, 18F-FP-CIT PET is analyzed visually or quantified using manual delineation of a volume of interest (VOI) for the striatum. In this study, we suggested and validated two simple quantitative methods based on automatic VOI delineation, using statistical probabilistic anatomical mapping (SPAM) and isocontour margin setting. Seventy-five 18F-FP-CIT images acquired in routine clinical practice were used for this study. A study-specific image template was made and the subject images were normalized to the template. Afterwards, uptakes in the striatal regions and cerebellum were quantified using probabilistic VOIs based on SPAM, and a quantitative parameter, Q_SPAM, was calculated to simulate binding potential. Additionally, the functional volume of each striatal region and its uptake were measured in automatically delineated VOIs using isocontour margin setting, and an uptake-volume product (Q_UVP) was calculated for each striatal region. Q_SPAM and Q_UVP were calculated for each visual grading, and the influence of cerebral atrophy on the measurements was tested. Image analyses were successful in all cases. Both Q_SPAM and Q_UVP were significantly different according to visual grading (p < 0.001). The agreements of Q_UVP and Q_SPAM with visual grading were slight to fair for the caudate nucleus (k = 0.421 and 0.291, respectively) and good to perfect for the putamen (k = 0.663 and 0.607, respectively). Also, Q_SPAM and Q_UVP had a significant correlation with each other (p < 0.001). Cerebral atrophy made a significant difference in the Q_SPAM and Q_UVP of caudate nucleus regions with decreased 18F-FP-CIT uptake. The simple quantitative measurements Q_SPAM and Q_UVP showed acceptable agreement with visual grading. Although Q_SPAM in some groups may be influenced by cerebral atrophy, these simple methods are expected to be effective in the quantitative analysis of 18F-FP
A garbage-processing technology has been developed that shreds, sterilizes, and separates inorganic and organic components of municipal solid waste. The technology not only greatly reduces waste volume, but the non-composted byproduct of this process, Fluff®, has the potential to be utilized as a s...
van Vliet, Ellen J.; Bredenhoff, E.; Bredenhoff, Eelco; Sermeus, Walter; Kop, Lucas M.; Sol, Johannes C.A.; van Harten, Willem H.
2011-01-01
Objective: To compare process designs of three high-volume cataract pathways in a lean thinking framework and to explore how efficiency in terms of lead times, hospital visits and costs is related to process design. Design: International retrospective comparative benchmark study with a mixed-method
Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A
2018-03-01
This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, resorting to real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. The methodology was tested on a set of five batches in which disturbances of different nature were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in terms of distinguishing batches with and without disturbances, resorting to typical MSPC charts: Hotelling's T2 and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
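The monitoring logic behind such MSPC charts can be sketched generically: fit a PCA model on normal-operation data, then score each new observation with Hotelling's T2 and the squared prediction error (SPE). The following pure-Python sketch (one principal component, toy two-variable data, not the paper's model) shows why a fault that breaks the correlation structure shows up in SPE:

```python
import math

def fit_pca1(X):
    """One-component PCA: column means, leading eigenvector of the
    covariance matrix (via power iteration), and its eigenvalue."""
    n, p = len(X), len(X[0])
    means = [sum(col) / n for col in zip(*X)]
    Xc = [[x - m for x, m in zip(row, means)] for row in X]
    cov = [[sum(r[i] * r[j] for r in Xc) / (n - 1) for j in range(p)]
           for i in range(p)]
    v = [1.0] * p
    for _ in range(200):                       # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = math.sqrt(sum(wi * wi for wi in w))
        v = [wi / norm for wi in w]
    lam = sum(v[i] * cov[i][j] * v[j] for i in range(p) for j in range(p))
    return means, v, lam

def t2_and_spe(x, means, v, lam):
    """Hotelling's T2 and squared prediction error of one observation
    against the one-component PCA model."""
    xc = [xi - m for xi, m in zip(x, means)]
    score = sum(a * b for a, b in zip(xc, v))
    spe = sum((a - score * b) ** 2 for a, b in zip(xc, v))
    return score * score / lam, spe

# two strongly correlated "process variables" under normal operation
X = [[1.0, 1.0], [2.0, 2.1], [3.0, 2.9], [4.0, 4.2], [5.0, 4.8]]
means, v, lam = fit_pca1(X)
t2_ok, spe_ok = t2_and_spe([3.5, 3.5], means, v, lam)        # in control
t2_fault, spe_fault = t2_and_spe([3.0, 0.0], means, v, lam)  # correlation broken
```

The fault point projects onto the dominant direction almost normally, but its large residual inflates SPE; that complementarity is why T2 and SPE charts are monitored together.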
Directory of Open Access Journals (Sweden)
Jaroslav Jedlička
2014-11-01
Full Text Available We focused on evaluating the influence of locality, vintage year and the fermentation process on the content of copper and lead in grape must and wine. First, the copper and lead contents were assessed in fresh grape musts; subsequently the musts were fermented. The wine analyses revealed a great decrease of copper due to the fermentation process. The assessed Cu2+ values vary from 0.07 to 0.2 mg.L-1 and represent a decrease of 90 to 97% from the original copper content. The copper content of the grapes is probably also significantly influenced by the amount of precipitation falling in the second part of the vegetation half-year. Total rainfall in the period before the grape harvest (August-September) was 153 mm in the first year and 137.5 mm in the second year; in both observed vintage years these were above-average values. Copper cannot be totally eliminated from the protection of the vine against fungal diseases, because pathogens do not develop resistance against it. To address this problem it is suitable to combine copper and organic products. Fermentation acts as a biological filter and also influences the lead content: in the analysed wines we found a decrease of the lead content of 25 to 94%. The maximal assessed Pb2+ value in wine was 0.09 mg.L-1. A linear relationship between the lead and copper contents of the grape must and those of the wine was not statistically demonstrated. We found a statistically significant effect of the vintage year on the lead content of the grape must, which, as we supposed, was connected with the quantity and distribution of atmospheric precipitation during the vegetation period. On the basis of the assessed results for lead and copper in wine, we state that by using faultless material and appropriate technological equipment during wine production it is possible to eliminate almost
2015-03-26
“universal definition” (Evans & Lindsay, 1996). Heizer and Render (2010) argue that several definitions of this term are user-based, meaning that quality... for example, “really good ice cream has high butterfat levels” (Heizer & Render, 2010). Garvin, in his Competing in Eight Dimensions of Quality... (Montgomery, 2005). As for definition purposes, the concept adopted by this research was provided by Heizer and Render (2010), for whom Statistical Process
Oberoi, Harinder Singh; Vadlani, Praveen V; Saida, Lavudi; Bansal, Sunil; Hughes, Joshua D
2011-07-01
Dried and ground banana peel biomass (BP) after hydrothermal sterilization pretreatment was used for ethanol production using simultaneous saccharification and fermentation (SSF). Central composite design (CCD) was used to optimize concentrations of cellulase and pectinase, temperature and time for ethanol production from BP using SSF. Analysis of variance showed a high coefficient of determination (R(2)) value of 0.92 for ethanol production. On the basis of model graphs and numerical optimization, the validation was done in a laboratory batch fermenter with cellulase, pectinase, temperature and time of nine cellulase filter paper unit/gram cellulose (FPU/g-cellulose), 72 international units/gram pectin (IU/g-pectin), 37 °C and 15 h, respectively. The experiment using optimized parameters in batch fermenter not only resulted in higher ethanol concentration than the one predicted by the model equation, but also saved fermentation time. This study demonstrated that both hydrothermal pretreatment and SSF could be successfully carried out in a single vessel, and use of optimized process parameters helped achieve significant ethanol productivity, indicating commercial potential for the process. To the best of our knowledge, ethanol concentration and ethanol productivity of 28.2 g/l and 2.3 g/l/h, respectively from banana peels have not been reported to date. Copyright © 2011 Elsevier Ltd. All rights reserved.
The Product Composition Control System at Savannah River: The statistical process control algorithm
International Nuclear Information System (INIS)
Brown, K.G.
1993-01-01
The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) in Aiken, South Carolina, will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. (In a separate facility, the soluble salts are disposed of as low-level waste in a mixture of cement, slag, and flyash.) In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. The repository requires that the glass wasteform be resistant to leaching by underground water that might contact it. In addition, there are processing constraints on melt viscosity, liquidus temperature, and waste solubility
Energy Technology Data Exchange (ETDEWEB)
Teodoro, Rodrigo; Dias, Carla R.B.R.; Osso Junior, Joao A., E-mail: jaosso@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Fernandez Nunez, Eutimio Gustavo [Universidade de Sao Paulo (EP/USP), SP (Brazil). Escola Politecnica. Dept. de Engenharia Quimica
2011-07-01
Precipitation of {sup 99}Mo by {alpha}-benzoin oxime ({alpha}-Bz) is a standard precipitation method for molybdenum due to the high selectivity of this agent. Nowadays, statistical analysis tools are employed in analytical systems to prove their efficiency and feasibility. IPEN has a project aiming at the production of {sup 99}Mo by the fission of {sup 235}U route. The first step of the processing is the precipitation of {sup 99}Mo with {alpha}-Bz. This precipitation step involves many key reaction parameters. The aim of this work is the development of the already known acidic route to produce {sup 99}Mo, as well as the optimization of the reaction parameters applying statistical tools. In order to simulate {sup 99}Mo precipitation, the study was conducted in acidic media using HNO{sub 3}, {alpha}-Bz as precipitant agent and NaOH/1% H{sub 2}O{sub 2} as dissolver solution. Then, a Mo carrier, KMnO{sub 4} solutions and a {sup 99}Mo tracer were added to the reaction flask. The reaction parameters ({alpha}-Bz/Mo ratio, Mo carrier, reaction time and temperature, and cooling time before filtration) were evaluated under a fractional factorial design of resolution V. The best values of each reaction parameter were determined by response surface statistical planning. The precipitation and recovery yields of {sup 99}Mo were measured using an HPGe detector. Statistical analysis of the experimental data suggested that the {alpha}-Bz/Mo ratio, reaction time and temperature have a significant impact on {sup 99}Mo precipitation. The optimization planning showed that higher {alpha}-Bz/Mo ratios, room temperature and lower reaction times lead to higher {sup 99}Mo yields. (author)
A catalytic wet oxidation process for mixed waste volume reduction/recycling
International Nuclear Information System (INIS)
Dhooge, Patrick M.
1992-01-01
Mixed wastes have presented a challenge to treatment and destruction technologies. A recently developed catalytic wet oxidation method has promising characteristics for volume reduction and recycling of mixed wastes. The process utilizes iron (III) as an oxidant in the presence of homogeneous cocatalysts which increase organics' oxidation rates and the rate of oxidation of iron (II) by oxygen. The reaction is conducted in an aqueous mineral acid solution at temperatures of 373 - 573 K. The mineral acid should solvate a number of heavy metals, including U and Pu. Studies of reaction rates show that the process can oxidize a wide range of organic compounds including aromatics and chlorinated hydrocarbons. Rate constants were in the range of 10^-7 to 10^-4 s^-1, depending on the cocatalyst, acidity, type of anions, type of organic, temperature, and time. Activation energies ranged from 25 to 32 kJ/mol. Preliminary measurements of the extent of oxidation which could be obtained ranged from 80% for trichloroethylene to 99.8% for 1,2,4-trimethylbenzene; evidence was obtained that absorption by the fluorocarbon liners of the reaction bombs allowed some of the organics to escape exposure to the catalyst solution. The results indicate that complete oxidation of the organics used here, and presumably many others, can be achieved. (author)
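Taking the reported first-order rate constants at face value, one can back out the contact time needed for a given destruction fraction. A rough sanity check under that first-order assumption, not a statement about the actual reactor:

```python
import math

def conversion(k, t):
    """First-order conversion fraction after time t (s) at rate
    constant k (s^-1): 1 - exp(-k t)."""
    return 1.0 - math.exp(-k * t)

# time to reach 99.8% oxidation at k = 1e-4 s^-1, the upper end of the
# reported range: t = ln(1 / (1 - 0.998)) / k
t_needed = math.log(1.0 / (1.0 - 0.998)) / 1e-4   # seconds
hours = t_needed / 3600.0
```

This puts the required residence time on the order of many hours at the fastest reported rates, consistent with the process being a bulk volume-reduction step rather than a fast flow-through treatment.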
Stress strain modelling of casting processes in the framework of the control volume method
DEFF Research Database (Denmark)
Hattel, Jesper; Andersen, Søren; Thorborg, Jesper
1998-01-01
Realistic computer simulations of casting processes call for the solution of both thermal, fluid-flow and stress/strain related problems. The multitude of the influencing parameters, and their non-linear, transient and temperature dependent nature, make the calculations complex. Therefore the need for fast, flexible, multidimensional numerical methods is obvious. The basis of the deformation and stress/strain calculation is a transient heat transfer analysis including solidification. This paper presents an approach where the stress/strain and the heat transfer analysis use the same computational domain, which is highly convenient. The basis of the method is the control volume finite difference approach on structured meshes. The basic assumptions of the method are shortly reviewed and discussed. As for other methods which aim at application-oriented analysis of casting deformations and stresses, the present model is based on the mainly decoupled representation of the thermal, mechanical and microstructural processes. Examples of industrial applications, such as predicting residual deformations in castings and stress levels in die casting dies, are presented.
Statistical and cost-benefit enhancements to the DQO process for characterization decisions
International Nuclear Information System (INIS)
Goodman, D.
1996-01-01
The costs of characterization can comprise a sizeable fraction of a remediation program budget. The DQO Process has been instituted at DOE to ensure that the investment in characterization adds net value to each remediation project. Thoughtful characterization can be very important in minimizing the total cost of a remediation. Strategic information gained by characterization can reduce remediation costs in two ways: by reducing unproductive investment in unnecessary remediation of portions of a site that do not really need to be remediated, and by reducing the frequency of expensive rework or emergency action that results when remediation has not been pursued to the extent that is really needed.
Statistical signal processing for gamma spectrometry: application for a pileup correction method
International Nuclear Information System (INIS)
Trigano, T.
2005-12-01
The main objective of gamma spectrometry is to characterize the radioactive elements of an unknown source by studying the energy of the emitted photons. When a photon interacts with a detector, its energy is converted into an electrical pulse. The histogram obtained by collecting the energies can be used to identify radioactive elements and measure their activity. However, at high counting rates, perturbations which are due to the stochastic aspect of the temporal signal can cripple the identification of the radioactive elements. More specifically, since the detector has a finite resolution, close arrival times of photons, which can be modeled as a homogeneous Poisson process, cause pile-ups of individual pulses. This phenomenon distorts energy spectra by introducing multiple fake spikes and artificially prolonging the Compton continuum, which can mask spikes of low intensity. The objective of this thesis is to correct the distortion caused by the pile-up phenomenon in the energy spectra. Since the shape of photonic pulses depends on many physical parameters, we consider this problem in a nonparametric framework. By introducing an adapted model based on two marked point processes, we establish a nonlinear relation between the probability measure associated to the observations and the probability density function we wish to estimate. This relation is derived both for continuous and for discrete time signals, and therefore can be used on a large set of detectors and from an analog or digital point of view. It also provides a framework to this problem, which can be considered as a problem of nonlinear density deconvolution and nonparametric density estimation from indirect measurements. Using these considerations, we propose an estimator obtained by direct inversion. We show that this estimator is consistent and almost achieves the usual rate of convergence obtained in classical nonparametric density estimation in the L2 sense. We have applied our method to a set of
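The pile-up mechanism described above, homogeneous Poisson arrivals closer together than the detector resolution, is easy to simulate; a sketch with illustrative count rate and resolution, not the thesis's detector parameters:

```python
import math
import random

def pileup_fraction(rate, resolution, n_events, rng):
    """Fraction of events whose waiting time to the next photon is
    below the detector resolution, i.e. events that pile up. For a
    homogeneous Poisson process the waiting times are exponential,
    so the theoretical value is 1 - exp(-rate * resolution)."""
    piled = sum(1 for _ in range(n_events)
                if rng.expovariate(rate) < resolution)
    return piled / n_events

rng = random.Random(7)
# 100 kcps count rate, 1 microsecond resolution
frac = pileup_fraction(rate=1e5, resolution=1e-6, n_events=200_000, rng=rng)
theory = 1.0 - math.exp(-1e5 * 1e-6)
```

At these illustrative settings roughly one event in ten piles up, which makes clear why the resulting spectral distortion cannot be ignored at high counting rates.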
International Nuclear Information System (INIS)
WESTCOTT, J.L.
2006-01-01
Two facilities for storing spent nuclear fuel underwater at the Hanford site in southeastern Washington State are being removed from service, decommissioned, and prepared for eventual demolition. The fuel-storage facilities consist of two separate basins called K East (KE) and K West (KW) that are large subsurface concrete pools filled with water, with a containment structure over each. The basins presently contain sludge, debris, and equipment that have accumulated over the years. The spent fuel has been removed from the basins. The process for removing the remaining sludge, equipment, and structure has been initiated for the basins. Ongoing removal operations generate solid waste that is being treated as required, and then disposed. The waste, equipment and building structures must be characterized to properly manage, ship, treat (if necessary), and dispose as radioactive waste. As the work progresses, it is expected that radiological conditions in each basin may change as radioactive materials are being moved within and between the basins. It is imperative that these changing conditions be monitored so that radioactive characterization of waste is adjusted as necessary
International Nuclear Information System (INIS)
WESTCOTT, J.L.; JOCHEN; PREVETTE
2007-01-01
Two facilities for storing spent nuclear fuel underwater at the Hanford site in southeastern Washington State are being removed from service, decommissioned, and prepared for eventual demolition. The fuel-storage facilities consist of two separate basins called K East (KE) and K West (KW) that are large subsurface concrete pools filled with water, with a containment structure over each. The basins presently contain sludge, debris, and equipment that have accumulated over the years. The spent fuel has been removed from the basins. The process for removing the remaining sludge, equipment, and structure has been initiated for the basins. Ongoing removal operations generate solid waste that is being treated as required, and then disposed. The waste, equipment and building structures must be characterized to properly manage, ship, treat (if necessary), and dispose as radioactive waste. As the work progresses, it is expected that radiological conditions in each basin may change as radioactive materials are being moved within and between the basins. It is imperative that these changing conditions be monitored so that radioactive characterization of waste is adjusted as necessary