#### Sample records for volume statistical process

1. Multivariate Statistical Process Control

DEFF Research Database (Denmark)

Kulahci, Murat

2013-01-01

As sensor and computer technology continues to improve, confronting high-dimensional data sets has become commonplace. As in many areas of industrial statistics, this brings various challenges in statistical process control (SPC) and monitoring, where the aim... is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high dimensional data with excessive...
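The Hotelling's T2 monitoring statistic that this abstract refers to can be sketched in a few lines. The reference data and the shifted observation below are synthetic illustrations, not data from the paper:

```python
import numpy as np

def hotelling_t2(X, mean, cov_inv):
    """T^2 statistic for each row of X against an in-control mean/covariance."""
    d = X - mean
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

rng = np.random.default_rng(0)
# "In-control" reference data: 200 observations of a 3-variable process.
ref = rng.normal(size=(200, 3))
mean = ref.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

# New observations; the last one is shifted far from the in-control mean.
new = np.vstack([rng.normal(size=(4, 3)), [[5.0, 5.0, 5.0]]])
t2 = hotelling_t2(new, mean, cov_inv)
print(t2)  # the last value dwarfs the first four
```

In a control chart, each T2 value would be compared against a limit derived from an F (or chi-square) distribution; points above the limit signal a possible assignable cause.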

2. Mathematical statistics and stochastic processes

CERN Document Server

Bosq, Denis

2013-01-01

Generally, books on mathematical statistics are restricted to the case of independent, identically distributed random variables. In this book, however, both that case and the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners. Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of probability theory, estimation, confidence intervals, non-parametric statistics and rob...

3. Probability, Statistics, and Stochastic Processes

CERN Document Server

Olofsson, Peter

2011-01-01

A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

4. Fundamentals of statistical signal processing

CERN Document Server

Kay, Steven M

1993-01-01

A unified presentation of parameter estimation for those involved in the design and implementation of statistical signal processing algorithms. Covers important approaches to obtaining an optimal estimator and analyzing its performance; and includes numerous examples as well as applications to real-world problems. MARKETS: For practicing engineers and scientists who design and analyze signal processing systems, i.e., to extract information from noisy signals — radar engineer, sonar engineer, geophysicist, oceanographer, biomedical engineer, communications engineer, economist, statistician, physicist, etc.

5. Statistical methods in language processing.

Science.gov (United States)

Abney, Steven

2011-05-01

The term statistical methods here refers to a methodology that has been dominant in computational linguistics since about 1990. It is characterized by the use of stochastic models, substantial data sets, machine learning, and rigorous experimental evaluation. The shift to statistical methods in computational linguistics parallels a movement in artificial intelligence more broadly. Statistical methods have so thoroughly permeated computational linguistics that almost all work in the field draws on them in some way. There has, however, been little penetration of the methods into general linguistics. The methods themselves are largely borrowed from machine learning and information theory. We limit attention to that which has direct applicability to language processing, though the methods are quite general and have many nonlinguistic applications. Not every use of statistics in language processing falls under statistical methods as we use the term. Standard hypothesis testing and experimental design, for example, are not covered in this article. WIREs Cogn Sci 2011 2 315-322 DOI: 10.1002/wcs.111 For further resources related to this article, please visit the WIREs website.

6. Object localization using the statistical behavior of volume speckle fields

Science.gov (United States)

Abregana, Timothy Joseph T.; Almoro, Percival F.

2016-12-01

Speckle noise presents challenges in object localization using reconstructed wavefronts. Here, a technique for axial localization of rough test objects based on a statistical algorithm that processes volume speckle fields is demonstrated numerically and experimentally. The algorithm utilizes the standard deviation of phase difference maps as a metric to characterize the object wavefront at different axial locations. Compared with an amplitude-based localization method utilizing energy of image gradient, the technique is shown to be robust against speckle noise.

7. Statistical thermodynamics of nonequilibrium processes

CERN Document Server

Keizer, Joel

1987-01-01

The structure of the theory of thermodynamics has changed enormously since its inception in the middle of the nineteenth century. Shortly after Thomson and Clausius enunciated their versions of the Second Law, Clausius, Maxwell, and Boltzmann began actively pursuing the molecular basis of thermodynamics, work that culminated in the Boltzmann equation and the theory of transport processes in dilute gases. Much later, Onsager undertook the elucidation of the symmetry of transport coefficients and, thereby, established himself as the father of the theory of nonequilibrium thermodynamics. Combining the statistical ideas of Gibbs and Langevin with the phenomenological transport equations, Onsager and others went on to develop a consistent statistical theory of irreversible processes. The power of that theory is in its ability to relate measurable quantities, such as transport coefficients and thermodynamic derivatives, to the results of experimental measurements. As powerful as that theory is, it is linear and...

8. Studies in Mathematics Education: The Teaching of Statistics, Volume 7.

Science.gov (United States)

Morris, Robert, Ed.

This volume examines the teaching of statistics in the whole range of education, but concentrates on primary and secondary schools. It is based upon selected topics from the Second International Congress on Teaching Statistics (ICOTS 2), convened in Canada in August 1986. The contents of this volume divide broadly into four parts: statistics in…

9. Statistical Inference at Work: Statistical Process Control as an Example

Science.gov (United States)

Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

2008-01-01

To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

10. Probability, Statistics, and Stochastic Processes

CERN Document Server

Olofsson, Peter

2012-01-01

This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and...

11. Representative volume size: A comparison of statistical continuum mechanics and statistical physics

Energy Technology Data Exchange (ETDEWEB)

Aidun, John B.; Trucano, Timothy G.; Lo, Chi S.; Fye, Richard M.

1999-05-01

In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.

12. Statistical processing of experimental data

OpenAIRE

NAVRÁTIL, Pavel

2012-01-01

This thesis covers probability theory and statistical data sets: solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Solutions are provided for the unsolved problems.

13. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

Science.gov (United States)

Cartier, Stephen F.

2011-01-01

A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

14. Columbia River Basin Seasonal Volumes and Statistics, 1928-1989. 1990 Level Modified Streamflows Computed Seasonal Volumes 61-Year Statistics.

Energy Technology Data Exchange (ETDEWEB)

A.G. Crook Company

1993-04-01

This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.

15. Statistical aspects of determinantal point processes

DEFF Research Database (Denmark)

Lavancier, Frédéric; Møller, Jesper; Rubak, Ege

The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical infer...

16. Statistical inference for Cox processes

DEFF Research Database (Denmark)

Møller, Jesper; Waagepetersen, Rasmus Plenge

2002-01-01

and space-time, spatial and spatio-temporal process modelling, nonparametric methods for clustering, and spatio-temporal cluster modelling.   Many figures, some in full color, complement the text, and a single section of references cited makes it easy to locate source material. Leading specialists...

17. Utilizing Statistical Dialogue Act Processing in Verbmobil

CERN Document Server

Reithinger, Norbert; Maier, Elisabeth

1995-01-01

In this paper, we present a statistical approach for dialogue act processing in the dialogue component of the speech-to-speech translation system Verbmobil. Statistics in dialogue processing are used to predict follow-up dialogue acts. As an application example we show how this supports repair when unexpected dialogue states occur.

18. Statistical process control for IMRT dosimetric verification.

Science.gov (United States)

Breen, Stephen L; Moseley, Douglas J; Zhang, Beibei; Sharpe, Michael B

2008-10-01

Patient-specific measurements are typically used to validate the dosimetry of intensity-modulated radiotherapy (IMRT). To evaluate the dosimetric performance over time of our IMRT process, we have used statistical process control (SPC) concepts to analyze the measurements from 330 head and neck (H&N) treatment plans. The objectives of the present work are to: (i) Review the dosimetric measurements of a large series of consecutive head and neck treatment plans to better understand appropriate dosimetric tolerances; (ii) analyze the results with SPC to develop action levels for measured discrepancies; (iii) develop estimates for the number of measurements that are required to describe IMRT dosimetry in the clinical setting; and (iv) evaluate with SPC a new beam model in our planning system. H&N IMRT cases were planned with the PINNACLE treatment planning system versions 6.2b or 7.6c (Philips Medical Systems, Madison, WI) and treated on Varian (Palo Alto, CA) or Elekta (Crawley, UK) linacs. As part of regular quality assurance, plans were recalculated on a 20-cm-diam cylindrical phantom, and ion chamber measurements were made in high-dose volumes (the PTV with highest dose) and in low-dose volumes (spinal cord organ-at-risk, OR). Differences between the planned and measured doses were recorded as a percentage of the planned dose. Differences were stable over time. Measurements with PINNACLE3 6.2b and Varian linacs showed a mean difference of 0.6% for PTVs (n=149, range, -4.3% to 6.6%), while OR measurements showed a larger systematic discrepancy (mean 4.5%, range -4.5% to 16.3%) that was due to well-known limitations of the MLC model in the earlier version of the planning system. Measurements with PINNACLE3 7.6c and Varian linacs demonstrated a mean difference of 0.2% for PTVs (n=160, range, -3.0%, to 5.0%) and -1.0% for ORs (range -5.8% to 4.4%). The capability index (ratio of specification range to range of the data) was 1.3 for the PTV data, indicating that almost
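The capability index used in this abstract is defined there as the ratio of the specification range to the range of the data, which is straightforward to compute. The dose-difference data below are synthetic stand-ins, not the paper's measurements:

```python
import numpy as np

def capability_index(diffs, spec_low, spec_high):
    """Ratio of the specification range to the range of the data,
    as the capability index is defined in the abstract above."""
    return (spec_high - spec_low) / (diffs.max() - diffs.min())

rng = np.random.default_rng(1)
# Hypothetical PTV dose differences (%): small mean offset, modest spread,
# checked against an illustrative +/-5% tolerance window.
ptv = rng.normal(0.2, 1.5, size=160)
ci = capability_index(ptv, -5.0, 5.0)
print(round(ci, 2))  # > 1 means the tolerance window exceeds the data spread
```

A value above 1 indicates the measurement process comfortably fits inside the specification, matching the paper's reading of 1.3 for the PTV data.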

19. STATISTICAL PROCESS CONTROL IN SERBIAN FOOD PACKAGING

Directory of Open Access Journals (Sweden)

Djekic Ilija

2014-09-01

This paper gives an overview of the food packaging process in seven food companies in the dairy and confectionery sectors. A total of 23 production runs were analyzed with respect to the three packers' rules outlined in Serbian legislation and to process capability tests related to statistical process control. None of the companies had any type of statistical process control in place. Results confirmed that more companies show overweight packaging than underfilling. Production runs are more accurate than precise, although in some cases the production is both inaccurate and imprecise. Education and training of the new generation of food industry workers (at both the operational and managerial level), with courses in the food area covering elements of quality assurance and statistical process control, can help in implementing effective food packaging.

20. Detecting Hidden Encrypted Volume Files via Statistical Analysis

Directory of Open Access Journals (Sweden)

Mario Piccinelli

2015-05-01

Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, the simple fact of being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we will present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
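A common ingredient of such statistical detection is byte-level Shannon entropy: well-encrypted data is statistically close to uniform random bytes, so its entropy approaches 8 bits per byte, while plaintext sits far lower. This is a minimal illustrative sketch, not the paper's actual method:

```python
import math
import os

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; ~8.0 for random or encrypted data."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

random_blob = os.urandom(1 << 16)            # stands in for an encrypted volume
text_blob = b"statistical process control " * 2000
print(round(byte_entropy(random_blob), 2))   # close to 8.0
print(round(byte_entropy(text_blob), 2))     # far lower, typical of plaintext
```

Compressed files also score near 8 bits per byte, which is why entropy alone cannot distinguish encryption from compression and real detectors combine several statistical tests.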

1. STATISTICAL SPACE-TIME ADAPTIVE PROCESSING ALGORITHM

Institute of Scientific and Technical Information of China (English)

Yang Jie

2010-01-01

For slowly changing, range-dependent non-homogeneous environments, a new statistical space-time adaptive processing algorithm is proposed. It uses statistical methods, such as the Bayes or likelihood criterion, to estimate an approximate covariance matrix under non-homogeneous conditions. Based on the statistical characteristics of the space-time snapshot data, and by defining the aggregate snapshot data and corresponding events, the conditional probability that a space-time snapshot constitutes effective training data is derived; weighting coefficients are then obtained for the weighting method. Theoretical analysis indicates that the Bayes and likelihood criteria for covariance matrix estimation are more reasonable than methods that estimate the covariance matrix from training data with the detected outliers removed. Simulations confirm that the proposed algorithm estimates the covariance accurately under non-homogeneous conditions and has favorable characteristics.

2. EWMA control charts in statistical process monitoring

NARCIS (Netherlands)

Zwetsloot, I.M.

2016-01-01

In today’s world, the amount of available data is steadily increasing, and it is often of interest to detect changes in the data. Statistical process monitoring (SPM) provides tools to monitor data streams and to signal changes in the data. One of these tools is the control chart. The topic of this
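The EWMA control chart this thesis studies smooths each observation into a running statistic and signals when that statistic leaves its control limits. A minimal sketch; the smoothing constant, shift size, and 3-sigma limit are illustrative choices:

```python
import numpy as np

def ewma(x, lam=0.2, target=0.0):
    """EWMA statistic z_t = lam*x_t + (1 - lam)*z_(t-1), started at the target."""
    z = np.empty(len(x))
    prev = target
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev
        z[i] = prev
    return z

rng = np.random.default_rng(2)
# 50 in-control points, then a sustained mean shift of 1.5 sigma.
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.5, 1.0, 50)])
z = ewma(x)
sigma_z = np.sqrt(0.2 / (2 - 0.2))  # asymptotic std of the EWMA statistic
signals = np.nonzero(np.abs(z) > 3 * sigma_z)[0]
print(signals[0])  # index of the first out-of-control signal
```

Because the EWMA accumulates evidence over successive points, it detects small sustained shifts that an individual-observations Shewhart chart would often miss.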

3. Statistical Physics for Natural Language Processing

CERN Document Server

Moreno, Juan-Manuel Torres; SanJuan, Eric

2010-01-01

In this paper we study the Enertex model, which has been applied to fundamental tasks in Natural Language Processing (NLP), including automatic document summarization and topic segmentation. The model is language independent. It is based on the intuitive concept of Textual Energy, inspired by neural networks and the statistical physics of magnetic systems. It can be implemented using simple matrix operations and, unlike PageRank-style algorithms, it avoids any iterative process.

4. Multivariate Statistical Process Control Process Monitoring Methods and Applications

CERN Document Server

Ge, Zhiqiang

2013-01-01

Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...

5. Statistical process control methods for expert system performance monitoring.

Science.gov (United States)

Kahn, M G; Bailey, T C; Steib, S A; Fraser, V J; Dunagan, W C

1996-01-01

The literature on the performance evaluation of medical expert systems is extensive, yet most of the techniques used in the early stages of system development are inappropriate for deployed expert systems. Because extensive clinical and informatics expertise and resources are required to perform evaluations, efficient yet effective methods of monitoring performance during the long-term maintenance phase of the expert system life cycle must be devised. Statistical process control techniques provide a well-established methodology that can be used to define policies and procedures for continuous, concurrent performance evaluation. Although the field of statistical process control was developed for monitoring industrial processes, its tools, techniques, and theory are easily transferred to the evaluation of expert systems. Statistical process tools provide convenient visual methods and heuristic guidelines for detecting meaningful changes in expert system performance. The underlying statistical theory provides estimates of the detection capabilities of alternative evaluation strategies. This paper describes a set of statistical process control tools that can be used to monitor the performance of a number of deployed medical expert systems. It describes how p-charts are used in practice to monitor the GermWatcher expert system. The case volume and error rate of GermWatcher are then used to demonstrate how different inspection strategies would perform.
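The p-chart mentioned above monitors an error proportion against 3-sigma binomial control limits. The baseline rate and monthly error counts below are hypothetical, not GermWatcher's actual figures:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion (p) chart with subgroup size n."""
    half = 3 * math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - half), min(1.0, p_bar + half)

# Hypothetical baseline: a 2% error rate over 500 classifications per month.
lcl, ucl = p_chart_limits(0.02, 500)
for month, errors in [("Jan", 12), ("Feb", 9), ("Mar", 27)]:
    p = errors / 500
    status = "in control" if lcl <= p <= ucl else "OUT OF CONTROL"
    print(month, round(p, 3), status)
```

With these numbers the limits are roughly (0.001, 0.039), so January and February stay in control while March's 5.4% error rate signals a meaningful change in system performance.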

6. Statistical image processing and multidimensional modeling

CERN Document Server

Fieguth, Paul

2010-01-01

Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

7. The statistical process control methods - SPC

Directory of Open Access Journals (Sweden)

Floreková Ľubica

1998-03-01

Methods of statistical evaluation of quality, SPC (item 20 of the quality documentation system of the ISO 9000 series), for various processes, products and services belong among the basic quantitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the following principles are presented in the contribution: diagnostics of cause and effect; Pareto analysis and the Lorenz curve; number distributions and frequency curves of random variable distributions; and Shewhart control charts.

8. Thermodynamically reversible processes in statistical physics

Science.gov (United States)

Norton, John D.

2017-02-01

Equilibrium states are used as limit states to define thermodynamically reversible processes. When these processes are understood in terms of statistical physics, these limit states can change with time due to thermal fluctuations. For macroscopic systems, the changes are insignificant on ordinary time scales and what little change there is can be suppressed by macroscopically negligible, entropy-creating dissipation. For systems of molecular sizes, the changes are large on short time scales. They can only sometimes be suppressed with significant entropy-creating dissipation, and this entropy creation is unavoidable if any process is to proceed to completion. As a result, at molecular scales, thermodynamically reversible processes are impossible in principle. Unlike the macroscopic case, they cannot be realized even approximately when we account for all sources of dissipation, and argumentation invoking them on molecular scales can lead to spurious conclusions.

9. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

Directory of Open Access Journals (Sweden)

B.P. Mahesh

2010-09-01

Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.

10. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

Science.gov (United States)

Sarraguça, Mafalda C; Ribeiro, Paulo R S; Santos, Adenilson O Dos; Lopes, João A

2015-12-01

Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature held together by noncovalent bonds. Their main advantages are the increase of solubility, bioavailability, permeability, stability, and at the same time retaining active pharmaceutical ingredient bioactivity. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, like the amount of solvent or amount of initial components present in the cocrystallization.

11. USCIS National Processing Volumes and Trends

Data.gov (United States)

Department of Homeland Security — The dataset provides the public with a comparison of form processing volumes and trend data for specific form types and offices with national levels. This gives the...

12. Mathematical SETI Statistics, Signal Processing, Space Missions

CERN Document Server

Maccone, Claudio

2012-01-01

This book introduces the Statistical Drake Equation where, from a simple product of seven positive numbers, the Drake Equation is turned into the product of seven positive random variables. The mathematical consequences of this transformation are demonstrated and it is proven that the new random variable N for the number of communicating civilizations in the Galaxy must follow the lognormal probability distribution when the number of factors in the Drake equation is allowed to increase at will. Mathematical SETI also studies the proposed FOCAL (Fast Outgoing Cyclopean Astronomical Lens) space mission to the nearest Sun Focal Sphere at 550 AU and describes its consequences for future interstellar precursor missions and truly interstellar missions. In addition the author shows how SETI signal processing may be dramatically improved by use of the Karhunen-Loève Transform (KLT) rather than Fast Fourier Transform (FFT). Finally, he describes the efforts made to persuade the United Nations to make the central part...
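The book's central observation, that a product of several independent positive random factors is approximately lognormal, is easy to check by Monte Carlo. The uniform factor distributions below are arbitrary illustrative choices, not the book's:

```python
import numpy as np

rng = np.random.default_rng(3)
# Seven independent positive factors, echoing the Statistical Drake Equation;
# the Uniform(0.5, 2) distributions are arbitrary illustrative choices.
factors = [rng.uniform(0.5, 2.0, 100_000) for _ in range(7)]
N = np.prod(factors, axis=0)

# log N is a sum of seven independent terms, so by the central limit
# theorem it is approximately normal, making N approximately lognormal.
logN = np.log(N)
skew = np.mean(((logN - logN.mean()) / logN.std()) ** 3)
print(round(skew, 2))  # modest skew: log N is already close to normal
```

As the number of factors grows, the skewness of log N shrinks toward zero, which is the sense in which N "must follow the lognormal probability distribution" in the limit.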

13. Statistical physics of media processes: Mediaphysics

Science.gov (United States)

Kuznetsov, Dmitri V.; Mandel, Igor

2007-04-01

The processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, animal populations, etc., taken as the subject of the special scientific sub-branch “mediaphysics”, are considered in relation to sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a “person's mind” and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, “word of mouth”, etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions were described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations of factors and the target variable.

14. Quantum Informatics View of Statistical Data Processing

OpenAIRE

Bogdanov, Yu. I.; Bogdanova, N. A.

2011-01-01

Application of root density estimator to problems of statistical data analysis is demonstrated. Four sets of basis functions based on Chebyshev-Hermite, Laguerre, Kravchuk and Charlier polynomials are considered. The sets may be used for numerical analysis in problems of reconstructing statistical distributions by experimental data. Examples of numerical modeling are given.

15. Parametric statistical inference for discretely observed diffusion processes

DEFF Research Database (Denmark)

Pedersen, Asger Roer

Part 1: Theoretical results. Part 2: Statistical applications of Gaussian diffusion processes in freshwater ecology.

16. Precision Measurement and Calibration. Volume 1. Statistical Concepts and Procedures

Science.gov (United States)

1969-02-01

…Sons, New York, N.Y.). Shewhart, Walter A. (1939), Statistical Method from the Viewpoint of Quality Control. Galilei, Galileo (1638), Discorsi e Dimostrazioni Matematiche Intorno a Due Nuove Scienze. Shewhart, Walter A. (1941), Contribution of statistics to the science of engineering, University of Pennsylvania Bicentennial. Galilei, Galileo (1898), Discorsi e Dimostrazioni Matematiche Intorno a Due Nuove Scienze, Le Opere di Galileo Galilei (Edizione Nazionale) VIII, pp. 39-448, Firenze.

17. Statistics of extreme objects in the Juropa Hubble Volume simulation

CERN Document Server

Watson, W A; Diego, J M; Gottlöber, S; Knebe, A; Martínez-González, E; Yepes, G

2013-01-01

We present the first results from the JUropa huBbLE volumE (Jubilee) project, based on the output from a large N-body, dark matter-only cosmological simulation with a volume of V=(6Gpc/h)^3, containing 6000^3 particles, performed within the concordance Lambda-CDM cosmological model. The simulation volume is sufficient to probe extremely large length scales in the universe, whilst at the same time the particle count is high enough so that dark matter haloes down to 1.5x10^12 M_sun/h can be resolved. At z = 0 we identify over 400 million haloes, and the first haloes in the simulation form at z = 11. We present an all-sky map of the Integrated Sachs Wolfe signal calculated from the gravitational potential in the box between z = 0-1.4. The cluster mass function is derived using three different halofinders and compared to fitting functions in the literature, with results being consistent with previous studies across most of the mass-range of the simulation. We compare simulated clusters of maximal mass across reds...

18. Statistical aspects of determinantal point processes

DEFF Research Database (Denmark)

Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger

inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference...... is conducted using the likelihood or moment properties....

19. STATISTICAL OPTIMIZATION OF PROCESS VARIABLES FOR ...

African Journals Online (AJOL)

2012-11-03

The osmotic dehydration process was optimized for water loss and solutes gain. ... Process Variables Optimization for Osmotic Dehydration of Okra in Sucrose Solution. ... Science des Aliments, Vol. 10.

20. Statistical properties of several models of fractional random point processes

Science.gov (United States)

Bendjaballah, C.

2011-08-01

Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
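
The reduced-variance criterion can be made concrete: for counts from a homogeneous Poisson process the variance-to-mean ratio (Fano factor) equals one, so a measured value below one indicates sub-Poissonian, nonclassical statistics. A minimal simulation sketch in Python; the rate, window length and sample count are arbitrary illustration values, not taken from the paper:

```python
import random
from statistics import mean, pvariance

def poisson_counts(rate, window, n_windows, rng):
    """Event counts of a homogeneous Poisson process in fixed windows."""
    counts = []
    for _ in range(n_windows):
        t, n = 0.0, 0
        while True:
            t += rng.expovariate(rate)   # exponential inter-event times
            if t > window:
                break
            n += 1
        counts.append(n)
    return counts

rng = random.Random(42)
counts = poisson_counts(5.0, 10.0, 4000, rng)
fano = pvariance(counts) / mean(counts)   # near 1 for a Poisson process
```

A fractional or otherwise nonclassical point process would be diagnosed by this estimator departing systematically from one.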

1. Use of statistical process control in the production of blood components

DEFF Research Database (Denmark)

Magnussen, K; Quere, S; Winkel, P

2008-01-01

Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced, when a change was made from a relatively large inexperienced...... by an experienced staff with four technologists. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional...
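
Being "in statistical control" is usually operationalised with Shewhart-type limits at three standard deviations around the mean of an in-control baseline; points outside the limits are attributed to assignable causes. A generic sketch, not the authors' actual charting procedure, with invented numbers:

```python
from statistics import mean, stdev

def shewhart_limits(baseline):
    """3-sigma control limits estimated from an in-control baseline period."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(series, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(series) if x < lcl or x > ucl]

# invented quality-control-like data: stable baseline, then one shifted point
baseline = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.9, 5.1, 5.0, 5.0]
lcl, ucl = shewhart_limits(baseline)
flags = out_of_control([5.0, 5.1, 7.5, 4.9], lcl, ucl)   # flags index 2
```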

2. An Improved Velocity Volume Processing Method

Institute of Scientific and Technical Information of China (English)

LI Nan; WEI Ming; TANG Xiaowen; PAN Yujie

2007-01-01

Velocity volume processing (VVP) retrieval of single Doppler radar is an effective method which can be used to obtain many wind parameters. However, due to the problem of an ill-conditioned matrix arising from the coefficients of equations not being easily resolved, the VVP method has not been applied adequately and effectively in operation. In this paper, an improved scheme, SVVP (step velocity volume processing), based on the original method, is proposed. The improved algorithm retrieves each group of components of the wind field through a stepwise procedure, which overcomes the problem of an ill-conditioned matrix, which currently limits the application of the VVP method. Variables in a six-parameter model can be retrieved even if the analysis volume is very small. In addition, the source and order of errors which exist in the traditional method are analyzed. The improved method is applied to real cases, which show that it is robust and has the capability to obtain the wind field structure of the local convective system. It is very helpful for studying severe storms.
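
The ill-conditioning can be illustrated with a toy linear model: when two basis functions in the retrieval are nearly collinear, as tends to happen when the analysis volume is small, the joint least-squares problem has a huge condition number, while fitting groups of parameters step by step against the running residual, in the spirit of SVVP, stays well behaved. The matrices below are invented and do not reproduce the actual VVP equations:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)

# two nearly collinear columns mimic retrieval bases in a small analysis volume
A = np.column_stack([np.ones(200), x, x + 1e-8 * rng.standard_normal(200)])
cond = np.linalg.cond(A)   # enormous: the joint problem is ill-conditioned

def stepwise_fit(groups, b):
    """Fit column groups one at a time against the running residual."""
    resid = b.astype(float).copy()
    coefs = []
    for G in groups:
        c, *_ = np.linalg.lstsq(G, resid, rcond=None)
        coefs.append(c)
        resid = resid - G @ c
    return coefs, resid

b = 2.0 + 3.0 * x + 0.01 * rng.standard_normal(200)
coefs, resid = stepwise_fit([A[:, :2], A[:, 2:]], b)   # well-posed sub-problems
```

Note that sequential fitting of correlated groups is not equivalent to the joint solution; it is shown here only to convey how a stepwise procedure sidesteps the ill-conditioned joint matrix.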

3. Modern statistics for spatial point processes

DEFF Research Database (Denmark)

Møller, Jesper; Waagepetersen, Rasmus

We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs, a......, and Cox process models, diagnostic tools and model checking, Markov chain Monte Carlo algorithms, computational methods for likelihood-based inference, and quick non-likelihood approaches to inference....

4. Modern Statistics for Spatial Point Processes

DEFF Research Database (Denmark)

Møller, Jesper; Waagepetersen, Rasmus

2007-01-01

We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs an...... and Cox process models, diagnostic tools and model checking, Markov chain Monte Carlo algorithms, computational methods for likelihood-based inference, and quick non-likelihood approaches to inference....

5. Statistical Processing Methods for Polarimetric Imagery

Science.gov (United States)

2008-09-01

University Press, USA, 1986. 16. Hecht, Eugene. Optics. Addison Wesley, San Francisco, 4th edition, 2002. 17. Jones, K.J. “Wavelet image processing...polarization state many times. Hecht refers to this state as randomly polarized or natural light [16] and points out, along with [42], that unpolarized light

6. Robust control charts in statistical process control

NARCIS (Netherlands)

Nazir, H.Z.

2014-01-01

The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust against

7. Dwarf Galaxy Starburst Statistics in the Local Volume

CERN Document Server

Lee, Janice C.; Funes, José G., S.J.; Sakai, Shoko; Akiyama, Sanae

2008-01-01

An unresolved question in galaxy evolution is whether the star formation histories of low mass systems are preferentially dominated by starbursts or modes that are more quiescent and continuous. Here, we quantify the prevalence of global starbursts in dwarf galaxies at the present epoch, and infer their characteristic durations and amplitudes. The analysis is based on the H-alpha component of the 11 Mpc H-alpha UV Galaxy Survey (11HUGS), which is providing H-alpha and GALEX UV imaging for an approximately volume-limited sample of ~300 star-forming galaxies within 11 Mpc. We first examine the completeness properties of the sample, and then directly tally the number of bursting dwarfs and compute the fraction of star formation that is concentrated in such systems. Our results are consistent with a picture where dwarfs that are currently experiencing massive global bursts are just the ~6% tip of a low-mass galaxy iceberg. Moreover, bursts are only responsible for about a quarter of the total star formation in th...

8. Data Mining Foundations and Intelligent Paradigms Volume 2 Statistical, Bayesian, Time Series and other Theoretical Aspects

CERN Document Server

Jain, Lakhmi

2012-01-01

Data mining is one of the most rapidly growing research areas in computer science and statistics. In Volume 2 of this three volume series, we have brought together contributions from some of the most prestigious researchers in theoretical data mining. Each of the chapters is self contained. Statisticians and applied scientists/ engineers will find this volume valuable. Additionally, it provides a sourcebook for graduate students interested in the current direction of research in data mining.

9. Statistical Inference for Partially Observed Diffusion Processes

DEFF Research Database (Denmark)

Jensen, Anders Christian

-dimensional Ornstein-Uhlenbeck where one coordinate is completely unobserved. This model does not have the Markov property and it makes parameter inference more complicated. Next we take a Bayesian approach and introduce some basic Markov chain Monte Carlo methods. In chapters five and six we describe a Bayesian method...... to perform parameter inference in multivariate diffusion models that may be only partially observed. The methodology is applied to the stochastic FitzHugh-Nagumo model and the two-dimensional Ornstein-Uhlenbeck process. Chapter seven focuses on parameter identifiability in the partially observed Ornstein...

10. Investigating intuitive and deliberate processes statistically

Directory of Open Access Journals (Sweden)

Andreas Glöckner

2009-04-01

Full Text Available One of the core challenges of decision research is to identify individuals' decision strategies without influencing decision behavior by the method used. Bröder and Schiffer (2003) suggested a method to classify decision strategies based on a maximum likelihood estimation, comparing the probability of individuals' choices given the application of a certain strategy and a constant error rate. Although this method was shown to be unbiased and practically useful, it obviously does not allow differentiating between models that make the same predictions concerning choices but different predictions for the underlying process, which is often the case when comparing complex to simple models or when comparing intuitive and deliberate strategies. An extended method is suggested that additionally includes decision times and confidence judgments in a simultaneous Multiple-Measure Maximum Likelihood estimation. In simulations, it is shown that the method is unbiased and sensitive enough to differentiate between strategies if the effects on times and confidence are sufficiently large.
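
For a constant error rate eps, the choice-based classification described above reduces to comparing binomial likelihoods: a strategy that mispredicts k of n choices has likelihood eps^k (1-eps)^(n-k). A stripped-down sketch with invented strategies and data:

```python
from math import log

def choice_log_likelihood(choices, predictions, eps=0.1):
    """Log-likelihood of observed choices given a strategy's predictions
    and a constant application-error rate eps."""
    k = sum(c != p for c, p in zip(choices, predictions))
    n = len(choices)
    return k * log(eps) + (n - k) * log(1.0 - eps)

def classify(choices, strategy_predictions):
    """Pick the strategy with maximal likelihood for this participant."""
    return max(strategy_predictions,
               key=lambda s: choice_log_likelihood(choices, strategy_predictions[s]))

choices = ['A', 'A', 'B', 'A', 'B', 'A']
strategies = {
    'take-the-best': ['A', 'A', 'B', 'A', 'B', 'B'],  # 1 mismatch
    'equal-weight':  ['B', 'A', 'A', 'A', 'B', 'B'],  # 3 mismatches
}
best = classify(choices, strategies)   # 'take-the-best'
```

The extension proposed in the paper multiplies in likelihood terms for decision times and confidence, which breaks ties between strategies that predict identical choices.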

11. Statistical kinetics of processive molecular motors

Science.gov (United States)

Schnitzer, Mark Jacob

1999-10-01

We describe new theoretical and experimental tools for studying biological motor proteins at the single molecule scale. These tools enable measurements of molecular fuel economies, thereby providing insight into the pathways for conversion of biochemical energy into mechanical work. Kinesin is an ATP-dependent motor that moves processively along microtubules in discrete steps of 8 nm. How many molecules of ATP are hydrolysed per step? To determine this coupling ratio, we develop a fluctuation analysis, which relates the variance in records of mechanical displacement to the number of rate-limiting biochemical transitions in the engine cycle. Using fluctuation analysis and optical trapping interferometry, we determine that near zero load, single molecules of kinesin hydrolyse one ATP nucleotide per 8-nm step. To study kinesin behavior under load, we use a molecular force clamp, capable of maintaining constant loads on single kinesin motors moving processively. Analysis of records of motion under variable ATP concentrations and loads reveals that kinesin is a 'tightly-coupled' motor, maintaining the 1:1 coupling ratio up to loads of ~ 5 pN. Moreover, a Michaelis-Menten analysis of velocity shows that the kinesin cycle contains at least two load-dependent transitions. The rate of one of these transitions affects ATP affinity, while the other does not. Therefore, the kinesin stall force must depend on the ATP concentration, as is demonstrated experimentally. These findings rule out existing theoretical models of kinesin motility. We develop a simple theoretical formalism describing a tightly-coupled mechanism for movement. This 'energy-landscape' formalism quantitatively accounts for motile properties of RNA polymerase (RNAP), the enzyme that transcribes DNA into RNA. The shapes of RNAP force-velocity curves indicate that biochemical steps limiting transcription rates at low loads do not generate movement. Modeling suggests that high loads may halt RNAP by promoting a
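
The fluctuation analysis rests on a renewal-process identity: if each 8-nm step requires n sequential rate-limiting (exponential) transitions, the long-time variance-to-mean ratio of the step count tends to 1/n. A simulation sketch with arbitrary illustration values:

```python
import random
from statistics import mean, pvariance

def steps_in_time(n_sub, rate, T, rng):
    """Completed steps in time T when each step needs n_sub sequential
    exponential transitions of the given rate."""
    t, steps = 0.0, 0
    while True:
        t += sum(rng.expovariate(rate) for _ in range(n_sub))
        if t > T:
            return steps
        steps += 1

rng = random.Random(7)
counts = [steps_in_time(2, 2.0, 200.0, rng) for _ in range(2000)]
randomness = pvariance(counts) / mean(counts)   # near 1/2 for two transitions
```

A measured randomness near one therefore points to a single rate-limiting transition per step, which is how the 1:1 ATP-per-step coupling was inferred.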

12. Using Statistical Process Control to Enhance Student Progression

Science.gov (United States)

Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

2012-01-01

Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

13. Process Model Construction and Optimization Using Statistical Experimental Design,

Science.gov (United States)

1988-04-01

Memo No. 88-442, March 1988. Process Model Construction and Optimization Using Statistical Experimental Design. Emmanuel Sachs and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic...

14. JSS Journal of Statistical Software January 2013, Volume 52, Issue 4. http://www.jstatsoft.org/ lgcp : Inference with Spatial and Spatio-Temporal Log-Gaussian Cox Processes in R

Directory of Open Access Journals (Sweden)

Benjamin M. Taylor

2013-01-01

Full Text Available This paper introduces an R package for spatial and spatio-temporal prediction and forecasting for log-Gaussian Cox processes. The main computational tool for these models is Markov chain Monte Carlo (MCMC and the new package, lgcp, therefore also provides an extensible suite of functions for implementing MCMC algorithms for processes of this type. The modeling framework and details of inferential procedures are first presented before a tour of lgcp functionality is given via a walk-through data-analysis. Topics covered include reading in and converting data, estimation of the key components and parameters of the model, specifying output and simulation quantities, computation of Monte Carlo expectations, post-processing and simulation of data sets.
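
The model behind lgcp, a Poisson process whose log-intensity is a Gaussian random field, is easy to sketch outside R as well. A minimal one-dimensional simulation in numpy; grid size, covariance range and mean level are arbitrary choices, not lgcp defaults:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
grid = np.linspace(0.0, 1.0, n)
dx = grid[1] - grid[0]

# exponential covariance for the latent Gaussian field
C = np.exp(-np.abs(grid[:, None] - grid[None, :]) / 0.1)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # jitter for numerical stability
field = L @ rng.standard_normal(n)

intensity = np.exp(3.0 + field)        # log-Gaussian intensity surface
counts = rng.poisson(intensity * dx)   # cell-wise Poisson counts on the grid
```

Inference in the package runs this construction in reverse, using MCMC to recover the latent field and its parameters from the observed counts.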

15. STATISTICAL CONTROL OF PROCESSES AND PRODUCTS IN AGRICULTURE

Directory of Open Access Journals (Sweden)

D. Horvat

2006-06-01

Full Text Available The fundamental concept of statistical process control is based on decision-making about the process by comparing data collected from the process with calculated control limits. Statistical process and quality control of agricultural products is used to provide agricultural products that will satisfy customer requirements with respect to quality as well as cost price. In accordance with ISO 9000, quality standards for processes and products are defined. There are many institutions in Croatia that work in accordance with these standards. Implementation of statistical process control and usage of control charts can greatly help in convergence to the standards and in decreasing of production costs. To illustrate the above, we tested the work quality of a nozzle on an eighteen-meter clutch sprayer.

16. Comparison of Statistically Modeled Contaminated Soil Volume Estimates and Actual Excavation Volumes at the Maywood FUSRAP Site - 13555

Energy Technology Data Exchange (ETDEWEB)

Moore, James [U.S. Army Corps of Engineers - New York District 26 Federal Plaza, New York, New York 10278 (United States); Hays, David [U.S. Army Corps of Engineers - Kansas City District 601 E. 12th Street, Kansas City, Missouri 64106 (United States); Quinn, John; Johnson, Robert; Durham, Lisa [Argonne National Laboratory, Environmental Science Division 9700 S. Cass Ave., Argonne, Illinois 60439 (United States)

2013-07-01

As part of the ongoing remediation process at the Maywood Formerly Utilized Sites Remedial Action Program (FUSRAP) properties, Argonne National Laboratory (Argonne) assisted the U.S. Army Corps of Engineers (USACE) New York District by providing contaminated soil volume estimates for the main site area, much of which is fully or partially remediated. As part of the volume estimation process, an initial conceptual site model (ICSM) was prepared for the entire site that captured existing information (with the exception of soil sampling results) pertinent to the possible location of surface and subsurface contamination above cleanup requirements. This ICSM was based on historical anecdotal information, aerial photographs, and the logs from several hundred soil cores that identified the depth of fill material and the depth to bedrock under the site. Specialized geostatistical software developed by Argonne was used to update the ICSM with historical sampling results and down-hole gamma survey information for hundreds of soil core locations. The updating process yielded both a best guess estimate of contamination volumes and a conservative upper bound on the volume estimate that reflected the estimate's uncertainty. Comparison of model results to actual removed soil volumes was conducted on a parcel-by-parcel basis. Where sampling data density was adequate, the actual volume matched the model's average or best guess results. Where contamination was un-characterized and unknown to the model, the actual volume exceeded the model's conservative estimate. Factors affecting volume estimation were identified to assist in planning further excavations. (authors)

17. A statistical approach to the initial volume problem in Single Particle Analysis by Electron Microscopy.

Science.gov (United States)

Sorzano, C O S; Vargas, J; de la Rosa-Trevín, J M; Otón, J; Álvarez-Cabrera, A L; Abrishami, V; Sesmero, E; Marabini, R; Carazo, J M

2015-03-01

Cryo Electron Microscopy is a powerful Structural Biology technique, allowing the elucidation of the three-dimensional structure of biological macromolecules. In particular, the structural study of purified macromolecules, often referred to as Single Particle Analysis (SPA), is normally performed through an iterative process that needs a first estimation of the three-dimensional structure that is progressively refined using experimental data. The local optimisation nature of this refinement is well known, so that the initial choice of this first structure may substantially change the final result. Computational algorithms aiming to provide this first structure already exist. However, the question is far from settled and more robust algorithms are still needed so that the refinement process can be performed with sufficient guarantees. In this article we present a new algorithm that addresses the initial volume problem in SPA by setting it in a Weighted Least Squares framework and calculating the weights through a statistical approach based on the cumulative density function of different image similarity measures. We show that the new algorithm is significantly more robust than other state-of-the-art algorithms currently in use in the field. The algorithm is available as part of the software suite Xmipp (http://xmipp.cnb.csic.es) and Scipion (http://scipion.cnb.csic.es) under the name "Significant".
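
The Weighted Least Squares framework itself can be sketched generically: rows are scaled by the square root of their weights and the ordinary solver is applied. In the paper the weights come from the cumulative density function of image similarity measures; here they are simply invented to show the mechanics:

```python
import numpy as np

def weighted_least_squares(A, b, w):
    """Solve min_x sum_i w_i * (A_i x - b_i)^2 by square-root row scaling."""
    sw = np.sqrt(w)
    x, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
    return x

rng = np.random.default_rng(3)
A = np.column_stack([np.ones(50), rng.uniform(0.0, 1.0, 50)])
b = A @ np.array([1.0, 2.0])
b[0] += 100.0        # one gross outlier, e.g. a badly matched image
w = np.ones(50)
w[0] = 1e-6          # a CDF-derived weight would likewise suppress it

x = weighted_least_squares(A, b, w)   # recovers [1, 2] despite the outlier
```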

18. Memory-type control charts in statistical process control

NARCIS (Netherlands)

Abbas, N.

2012-01-01

The control chart is the most important statistical tool to manage business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark the expected range of common-cause variation. Once

19. Statistical Analysis of Processes of Bankruptcy is in Ukraine

OpenAIRE

Berest Marina Nikolaevna

2012-01-01

The statistical analysis of processes of bankruptcy in Ukraine is conducted. Quantitative and high-quality indexes, characterizing efficiency of functioning of institute of bankruptcy of enterprises, are analyzed; the analysis of processes, related to bankruptcy of enterprises, being in state administration is conducted.

20. Manufacturing Squares: An Integrative Statistical Process Control Exercise

Science.gov (United States)

Coy, Steven P.

2016-01-01

In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

1. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

Science.gov (United States)

Thiessen, Erik D

2017-01-05

2. Hand surgery volume and the US economy: is there a statistical correlation?

Science.gov (United States)

Gordon, Chad R; Pryor, Landon; Afifi, Ahmed M; Gatherwright, James R; Evans, Peter J; Hendrickson, Mark; Bernard, Steven; Zins, James E

2010-11-01

To the best of our knowledge, there have been no previous studies evaluating the correlation of the US economy and hand surgery volume. Therefore, in light of the current recession, our objective was to study our institution's hand surgery volume over the last 17 years in relation to the nation's economy. A retrospective analysis of our institution's hand surgery volume, as represented by our most common procedure (ie, carpal tunnel release), was performed between January 1992 and October 2008. Liposuction and breast augmentation volumes were chosen to serve as cosmetic plastic surgery comparison groups. Pearson correlation statistics were used to estimate the relationship between the surgical volume and the US economy, as represented by the 3 market indices (Dow Jones, NASDAQ, and S&P500). A combined total of 7884 hand surgery carpal tunnel release (open or endoscopic) patients were identified. There were 1927 (24%) and 5957 (76%) patients within the departments of plastic and orthopedic surgery, respectively. In the plastic surgery department, there was a strong negative (ie, inverse) correlation (P < .05) between hand surgery volume and the US economy, as represented by the 3 major market indices. In contrast, orthopedic hand surgery volume and cosmetic surgery show a parallel (ie, positive) correlation. These data suggest that plastic surgeons are increasing their cosmetic surgery-to-reconstructive/hand surgery ratio during strong economic times and vice versa during times of economic slowdown.
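
Pearson correlation, the statistic used above, is simple to compute directly. A stdlib sketch with invented yearly series, not the study's data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

index = [100, 120, 150, 140, 180]      # invented market index levels
recon_volume = [90, 80, 62, 68, 45]    # invented reconstructive caseload

r = pearson_r(index, recon_volume)     # strongly negative: inverse relation
```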

3. Statistical Process Control in a Modern Production Environment

DEFF Research Database (Denmark)

Windfeldt, Gitte Bjørg

gathered here and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications...... If the estimated probability exceeds a pre-determined threshold the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method makes it possible to build diagnostic plots based on the parameter estimates that can provide valuable insight...... into the process. The method is explored numerically and a case study is provided. In Paper 3 the method is explored in a bivariate setting. Paper 4 is a case study on a problem regarding missing values in an industrial process. The impact of the missing values on the quality measures of the process is assessed...
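
The monitoring rule of Paper 2, estimating from a sliding window the probability that the next item violates the specifications and stopping when it exceeds a threshold, can be sketched with a simple normal model. The Gaussian choice and all numbers are illustrative; the paper allows far richer models:

```python
from statistics import NormalDist, mean, stdev

def prob_nonconforming(window, lsl, usl):
    """P(next observation outside [lsl, usl]) under a fitted normal model."""
    nd = NormalDist(mean(window), stdev(window))
    return nd.cdf(lsl) + (1.0 - nd.cdf(usl))

def should_stop(window, lsl, usl, threshold=0.01):
    """Stop the process if the estimated violation probability is too high."""
    return prob_nonconforming(window, lsl, usl) > threshold

centred = [10.0, 10.1, 9.9, 10.05, 9.95, 10.0]   # window well inside spec
shifted = [10.8, 10.9, 10.7, 10.85, 10.75, 10.8] # window drifted toward USL

ok = should_stop(centred, 9.5, 10.5)    # False: keep running
bad = should_stop(shifted, 9.5, 10.5)   # True: next item likely out of spec
```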

4. Processes for an Architecture of Volume

DEFF Research Database (Denmark)

Mcgee, Wes; Feringa, Jelle; Søndergaard, Asbjørn

2013-01-01

This paper addresses both the architectural, conceptual motivations and the tools and techniques necessary for the digital production of an architecture of volume. The robotic manufacturing techniques of shaping volumetric materials by hot wire and abrasive wire cutting are discussed through...

5. Using statistical methods of quality management in logistics processes

Directory of Open Access Journals (Sweden)

Tkachenko Alla

2016-04-01

Full Text Available The purpose of the paper is to study the application of statistical methods of logistics process quality management at a large industrial enterprise and to test the theoretical results. The analysis of the publications shows that a significant number of works by both Ukrainian and foreign authors have been dedicated to the research of quality management, while statistical methods of quality management have only been thoroughly analyzed by a small number of researchers, since these methods are referred to as classical, that is, those that are considered well-known and do not require special attention of modern scholars. In the authors’ opinion, the logistics process is a process of transformation and movement of material and accompanying flows by ensuring management freedom under the conditions of sequential interdependencies; standardization; synchronization; sharing information, and consistency of incentives, using innovative methods and models. In our study, we have shown that the management of logistics processes should use such statistical methods of quality management as descriptive statistics, experiment planning, hypotheses testing, measurement analysis, process opportunities analysis, regression analysis, reliability analysis, sampling, modeling, maps of statistical process control, specification of statistical tolerance, time series analysis. The proposed statistical methods of logistics processes quality management have been tested at the large industrial enterprise JSC "Dniepropetrovsk Aggregate Plant" that specializes in manufacturing hydraulic control valves. The findings suggest that the main purpose in the sphere of logistics processes quality is the continuous improvement of the mining equipment production quality through the use of innovative processes, advanced management systems and information technology. This will enable the enterprise to meet the requirements and expectations of their customers. It has been proved that the

6. Statistical Modeling of Ultrawideband Body-Centric Wireless Channels Considering Room Volume

Directory of Open Access Journals (Sweden)

Miyuki Hirose

2012-01-01

Full Text Available This paper presents the results of a statistical modeling of on-body ultrawideband (UWB) radio channels for wireless body area network (WBAN) applications. Measurements were conducted in five different rooms. A measured delay profile can be divided into two domains; the first domain (<4 ns) is dominated by propagation along the body and is essentially independent of the room, while the second domain (>4 ns) has multipath components that are dominant and dependent on room volume. The first domain was modeled with a conventional power decay law model, and the second domain with a modified Saleh-Valenzuela model considering the room volume. Realizations of the impulse responses are presented based on the composite model and compared with the measured average power delay profiles.
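
Fitting the "conventional power decay law" for the first domain amounts to a straight-line fit of log-power against delay. A stdlib sketch that recovers a known decay constant from synthetic, noiseless profile samples (all values invented):

```python
from math import exp, log

def fit_exponential_decay(taus, powers):
    """Least-squares line through (tau, ln P): P(tau) = P0 * exp(-tau / gamma)."""
    n = len(taus)
    ys = [log(p) for p in powers]
    mt, my = sum(taus) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(taus, ys))
             / sum((t - mt) ** 2 for t in taus))
    return exp(my - slope * mt), -1.0 / slope   # (P0, gamma)

taus = [0.5 * k for k in range(1, 9)]             # delays, in ns
powers = [2.0 * exp(-tau / 1.5) for tau in taus]  # synthetic noiseless profile

p0, gamma = fit_exponential_decay(taus, powers)   # recovers P0 = 2, gamma = 1.5
```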

7. Sintering as a process of transport of activated volume

Directory of Open Access Journals (Sweden)

Nikolić Nataša S.

2002-01-01

Full Text Available Starting with the fact that sintering is the consequence of the process of transport of activated volume, it has been shown how the kinetics of the sintering process can be defined. The activated volume was in principle defined as a parameter which describes a system’s defectivity on an atomic level.

8. High Volume Colour Image Processing with Massively Parallel Embedded Processors

NARCIS (Netherlands)

Jacobs, Jan W.M.; Bond, W.; Pouls, R.; Smit, Gerard J.M.; Joubert, G.R.; Peters, F.J.; Tirado, P.; Nagel, W.E.; Plata, O.; Zapata, E.

2006-01-01

Currently Oce uses FPGA technology for implementing colour image processing for their high volume colour printers. Although FPGA technology provides enough performance it, however, has a rather tedious development process. This paper describes the research conducted on an alternative implementation

9. Reaming process improvement and control: An application of statistical engineering

DEFF Research Database (Denmark)

Müller, Pavel; Genta, G.; Barbato, G.

2012-01-01

A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...... and combined effects of several parameters on key responses. Results supported selection of production parameters meeting specified quality and cost targets, as well as substantial improvements....

10. Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR

Science.gov (United States)

2014-07-12

Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR. The views, opinions and/or findings contained in this report are those of the author(s) and should not... Sponsoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: landmine detection, signal processing, GPR.

11. Statistical process management: An essential element of quality improvement

Science.gov (United States)

Buckner, M. R.

Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

12. Interdependence of the volume and stress ensembles and equipartition in statistical mechanics of granular systems.

Science.gov (United States)

Blumenfeld, Raphael; Jordan, Joe F; Edwards, Sam F

2012-12-07

We discuss the statistical mechanics of granular matter and derive several significant results. First, we show that, contrary to common belief, the volume and stress ensembles are interdependent, necessitating the use of both. We use the combined ensemble to calculate explicitly expectation values of structural and stress-related quantities for two-dimensional systems. We thence demonstrate that structural properties may depend on the angoricity tensor and that stress-based quantities may depend on the compactivity. This calls into question previous statistical mechanical analyses of static granular systems and related derivations of expectation values. Second, we establish the existence of an intriguing equipartition principle-the total volume is shared equally amongst both structural and stress-related degrees of freedom. Third, we derive an expression for the compactivity that makes it possible to quantify it from macroscopic measurements.

13. Discussion of "Modern Statistics for Spatial Point Processes"

DEFF Research Database (Denmark)

Møller, Jesper; Waagepetersen, Rasmus; Jensen, Eva B. Vedel

2007-01-01

The paper ‘Modern statistics for spatial point processes' by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti Penttinen...... and Eva B. Vedel Jensen were invited to discuss the paper. We here present the comments from the two invited discussants and from a number of other scholars, as well as the authors' responses to these comments. Below Figure 1, Figure 2, etc., refer to figures in the paper under discussion, while Figure...

14. Large deviations of ergodic counting processes: a statistical mechanics approach.

Science.gov (United States)

2011-07-01

The large-deviation method makes it possible to characterize an ergodic counting process in terms of a thermodynamic framework where a free energy function determines the asymptotic nonstationary statistical properties of its fluctuations. Here we study this formalism through a statistical mechanics approach, that is, with an auxiliary counting process that maximizes an entropy function associated with the thermodynamic potential. We show that the realizations of this auxiliary process can be obtained after applying a conditional measurement scheme to the original ones, providing in this way an alternative measurement interpretation of the thermodynamic approach. General results are obtained for renewal counting processes, that is, those where the time intervals between consecutive events are independent and defined by a unique waiting time distribution. The underlying statistical mechanics is controlled by the same waiting time distribution, rescaled by an exponential decay measured by the free energy function. Scale invariance, shift closure, and intermittency phenomena are obtained and interpreted in this context. Similar conclusions apply for nonrenewal processes when the memory between successive events is induced by a stochastic waiting time distribution.
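
For the simplest renewal case, a Poisson process, the free energy (scaled cumulant generating) function is known in closed form: lambda(s) = (1/t) ln E[exp(s N_t)] = r (exp(s) - 1). A quick Monte Carlo check of this identity; the rate, horizon and field s are arbitrary:

```python
import math
import random

r, t, s = 1.0, 20.0, -0.2        # rate, observation time, conjugate field

def poisson_sample(mu, rng):
    """Knuth's Poisson sampler; adequate for moderate means."""
    L, k, p = math.exp(-mu), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(11)
n_samples = 100000
mgf = sum(math.exp(s * poisson_sample(r * t, rng))
          for _ in range(n_samples)) / n_samples

scgf_mc = math.log(mgf) / t            # Monte Carlo estimate of lambda(s)
scgf_exact = r * (math.exp(s) - 1.0)   # closed form for the Poisson process
```

A negative s is used here because exp(s N) is then bounded, which keeps the Monte Carlo estimate of the moment generating function well behaved.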

15. Monitoring Software Reliability using Statistical Process Control: An MMLE Approach

Directory of Open Access Journals (Sweden)

Bandla Sreenivasa Rao

2011-11-01

Full Text Available This paper considers an MMLE (Modified Maximum Likelihood Estimation) based scheme to estimate software reliability using the exponential distribution. The MMLE is one of the generalized frameworks of software reliability models of Non-Homogeneous Poisson Processes (NHPPs). The MMLE gives analytical estimators rather than an iterative approximation to estimate the parameters. In this paper we propose an SPC (Statistical Process Control) charts mechanism to determine software quality using inter-failure times data. The control charts can be used to measure whether the software process is statistically under control or not.
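The individuals and moving-range control limits underlying such an SPC scheme can be sketched as below. This is a minimal illustration with hypothetical inter-failure times, not data from the paper; the standard chart constants d2 = 1.128 and D4 = 3.267 for moving ranges of size 2 are used.

```python
# Sketch: individuals (I) and moving-range (MR) control limits computed
# from inter-failure times, as used in SPC-based reliability monitoring.
# The inter_failure data below are hypothetical.

def imr_limits(times):
    """Return ((I-chart lower, upper), MR-chart upper limit)."""
    n = len(times)
    mean = sum(times) / n
    mrs = [abs(b - a) for a, b in zip(times, times[1:])]
    mr_bar = sum(mrs) / len(mrs)
    sigma = mr_bar / 1.128           # d2 constant for moving ranges of 2
    i_limits = (mean - 3 * sigma, mean + 3 * sigma)
    mr_upper = 3.267 * mr_bar        # D4 constant for n = 2
    return i_limits, mr_upper

inter_failure = [12.0, 15.5, 9.8, 14.2, 11.1, 13.7, 10.4, 16.0]
(i_lo, i_hi), mr_ul = imr_limits(inter_failure)
# Points outside the I-chart limits signal an out-of-control process
out_of_control = [t for t in inter_failure if not i_lo <= t <= i_hi]
```

Any inter-failure time falling outside `(i_lo, i_hi)` would flag the software process as statistically out of control.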

16. Statistical representative elementary volumes of porous media determined using greyscale analysis of 3D tomograms

Science.gov (United States)

Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

2017-09-01

Digital rock physics carries the dogmatic concept of having to segment volume images for quantitative analysis but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area and diffusive tortuosity, by using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel level porosity estimate using a Gaussian mixture model. A simple model assumption was made that allowed formulating a two point correlation function for surface area estimates using Bayes' theory. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed in binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e. resolution dependence and intersample and intrasample variance. Although we cannot make any claims on the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
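The transformation of grey-level intensities to voxel-level porosity estimates can be sketched with a two-component 1D Gaussian mixture fitted by EM, in the spirit of the segmentation-free approach described above. The synthetic grey values, component means, and pore fraction below are illustrative assumptions, not values from the tomograms.

```python
import numpy as np

# Sketch: fit a two-component Gaussian mixture to grey levels by EM and
# read off a voxel-level porosity estimate as the posterior probability
# of the darker (pore-like) component. All data here are synthetic.

rng = np.random.default_rng(0)
grey = np.concatenate([rng.normal(50, 8, 4000),     # pore-like voxels
                       rng.normal(150, 10, 6000)])  # solid-like voxels

mu = np.array([40.0, 160.0])   # initial component means
sd = np.array([20.0, 20.0])    # initial standard deviations
w = np.array([0.5, 0.5])       # initial mixture weights
for _ in range(50):
    # E-step: responsibility of each component for each voxel
    pdf = np.exp(-0.5 * ((grey[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update weights, means and standard deviations
    nk = resp.sum(axis=0)
    w = nk / len(grey)
    mu = (resp * grey[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (grey[:, None] - mu) ** 2).sum(axis=0) / nk)

pore = int(np.argmin(mu))              # darker component taken as pore phase
porosity = resp[:, pore].mean()        # sample-level porosity estimate
```

Because each voxel keeps a fractional porosity rather than a binary label, partial volume voxels contribute signal instead of being forced into one phase.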

17. Statistical Inference for Point Process Models of Rainfall

Science.gov (United States)

Smith, James A.; Karr, Alan F.

1985-01-01

In this paper we develop maximum likelihood procedures for parameter estimation and model selection that apply to a large class of point process models that have been used to model rainfall occurrences, including Cox processes, Neyman-Scott processes, and renewal processes. The statistical inference procedures are based on the stochastic intensity λ(t) = lim_{s→0, s>0} (1/s) E[N(t + s) − N(t) | N(u), u ≤ t]. The likelihood function of the point process is shown to have a simple expression in terms of the stochastic intensity. The main result of this paper is a recursive procedure for computing stochastic intensities; the procedure is applicable to a broad class of point process models, including renewal Cox processes with Markovian intensity processes and an important class of Neyman-Scott processes. The model selection procedure we propose, which is based on likelihood ratios, allows direct comparison of two classes of point processes to determine which provides a better model for a given data set. The estimation and model selection procedures are applied to two data sets of simulated Cox process arrivals and a data set of daily rainfall occurrences in the Potomac River basin.
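The likelihood expression in terms of the stochastic intensity can be illustrated in the simplest case. For a point process on [0, T], log L = Σ_i log λ(t_i) − ∫_0^T λ(t) dt; for a homogeneous Poisson process with constant rate lam this is N log(lam) − lam·T, maximised at lam = N/T. The event times below are hypothetical.

```python
import math

# Sketch: point process log-likelihood via the stochastic intensity,
# specialised to a homogeneous Poisson process (lambda(t) = lam):
#   log L = N * log(lam) - lam * T, with MLE lam_hat = N / T.
# Event times are hypothetical.

def poisson_loglik(event_times, T, lam):
    return len(event_times) * math.log(lam) - lam * T

events = [0.8, 2.1, 2.9, 4.4, 6.0, 7.3, 9.5]
T = 10.0
lam_hat = len(events) / T    # maximum likelihood estimate of the rate
```

For the Cox and Neyman-Scott models treated in the paper, λ(t) is random and must be computed recursively, but the same likelihood structure applies.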

18. Statistically Appraising Process Quality of Affinity Isolation Experiments

Energy Technology Data Exchange (ETDEWEB)

Sharp, Julia L.; Borkowski, John J.; Schmoyer, Denise A.; Daly, Don S.; Purvine, Samuel O.; Cannon, William R.; Hurst, G. B.

2009-03-15

Quality affinity isolation experiments are necessary to identify valid protein-protein interactions. Biological error, processing error, and random variability can reduce the quality of an experiment, and thus hinder the identification of protein interaction pairs. Appraising affinity isolation assay quality is essential to inferring protein associations. An important step of the assay is the mass spectrometric identification of proteins. To evaluate this step, a known mixture of proteins is processed through a mass spectrometer as a quality control mixture. If the mass spectrometer yields unexpected results, the process is currently qualitatively evaluated, tuned, and reset. Statistical quality control (SQC) procedures, including the use of cumulative sum, the individual measurement, and moving range charts are implemented to analyze the stability of the mass spectrometric analysis. The SQC measures presented here can assist in establishing preliminary control limits to identify an out-of-control process and investigate assignable causes for shifts in the process mean in real-time.
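One of the SQC procedures named above, the tabular CUSUM, can be sketched as follows. The target, slack value k, and decision interval h, as well as the peptide-count measurements, are hypothetical choices for illustration, not values from the study.

```python
# Sketch: one-sided tabular CUSUM statistics applied to hypothetical
# quality-control measurements; an alarm marks a shift in the process
# mean worth investigating for assignable causes.

def cusum(data, target, k, h):
    """Return indices where the upper or lower CUSUM exceeds h."""
    c_plus = c_minus = 0.0
    alarms = []
    for i, x in enumerate(data):
        c_plus = max(0.0, c_plus + (x - target - k))    # detects upward shift
        c_minus = max(0.0, c_minus + (target - k - x))  # detects downward shift
        if c_plus > h or c_minus > h:
            alarms.append(i)
    return alarms

counts = [100, 102, 98, 101, 99, 108, 110, 109, 111, 112]
alarms = cusum(counts, target=100, k=2, h=10)  # alarms begin once the mean shifts
```

The CUSUM accumulates small deviations, so it signals the sustained upward shift in the second half of the series faster than a Shewhart chart would.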

19. Statistically Appraising Process Quality of Affinity-Isolation Experiments

Energy Technology Data Exchange (ETDEWEB)

Sharp, Julia L. [Clemson University; Borkowski, John J [Montana State University; Schmoyer, Denise D [ORNL; Daly, Don S. [Pacific Northwest National Laboratory (PNNL); Purvine, Samuel [Pacific Northwest National Laboratory (PNNL); Cannon, Bill [Pacific Northwest National Laboratory (PNNL); Hurst, Gregory {Greg} B [ORNL

2009-01-01

Quality affinity isolation experiments are necessary to identify valid protein-protein interactions. Biological error, processing error, and random variability can reduce the quality of an experiment, and thus hinder the identification of protein interaction pairs. Appraising affinity isolation assay quality is essential to inferring protein associations. An important step of the assay is the mass spectrometric identification of proteins. To evaluate this step, a known mixture of proteins is processed through a mass spectrometer as a quality control mixture. If the mass spectrometer yields unexpected results, the process is currently qualitatively evaluated, tuned, and reset. Statistical quality control (SQC) procedures, including the use of cumulative sum, the individual measurement, and moving range charts are implemented to analyze the stability of the mass spectrometric analysis. The SQC measures presented here can assist in establishing preliminary control limits to identify an out-of-control process and investigate assignable causes for shifts in the process mean in real-time.

20. The Pearson diffusions: A class of statistically tractable diffusion processes

DEFF Research Database (Denmark)

Forman, Julie Lyng; Sørensen, Michael

The Pearson diffusions form a flexible class of diffusions defined by having linear drift and quadratic squared diffusion coefficient. It is demonstrated that for this class explicit statistical inference is feasible. Explicit optimal martingale estimating functions are found, and the corresponding...... volatility models with Pearson volatility process. For the non-Markov models explicit optimal prediction based estimating functions are found and shown to yield consistent and asymptotically normal estimators...

1. Application of statistical process control to qualitative molecular diagnostic assays.

Directory of Open Access Journals (Sweden)

Cathal P. O'Brien

2014-11-01

Full Text Available Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed reliable systems for statistical process control. Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply statistical process control to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence-interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require larger sample numbers, with a resultant protracted time to detection. Modelled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
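The frequency-plus-confidence-interval idea can be sketched with a binomial interval check. The Wilson score interval is used here as one standard choice; the expected mutation frequency and counts are hypothetical, not the study's data.

```python
import math

# Sketch: flag a qualitative assay as out of control when the expected
# mutation frequency falls outside a confidence interval around the
# observed frequency. Expected frequency and counts are hypothetical.

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

expected = 0.40                 # expected mutation frequency (assumed)
observed_mutant, n = 25, 100    # observed mutant calls in n samples
lo, hi = wilson_ci(observed_mutant, n)
in_control = lo <= expected <= hi   # False here: expected lies above the CI
```

The interval width shrinks as n grows, which is why low mutation frequencies and small deviations demand larger sample numbers before a shift becomes detectable.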

2. Ironmaking Process Alternative Screening Study, Volume 1

Energy Technology Data Exchange (ETDEWEB)

Lockwood Greene

2005-01-06

Iron in the United States is largely produced from iron ore mined in the United States or imported from Canada or South America. The iron ore is typically smelted in Blast Furnaces that use primarily iron ore, iron concentrate pellets, metallurgical coke, limestone and lime as the raw materials. Under current operating scenarios, the iron produced from these Blast Furnaces is relatively inexpensive compared to current alternative iron sources, e.g. direct iron reduction, imported pig iron, etc. The primary problem with the Blast Furnace ironmaking approach is that many of these Blast Furnaces are relatively small compared to the newer, larger Blast Furnaces, and are thus relatively costly and inefficient to operate. A further problem is that supplies of high-grade metallurgical coke are increasingly short and its cost is rising. In part this is due to the short supply and high cost of high-grade metallurgical coals, but it is also due to the increasingly stringent environmental controls required for coke production. New environmental regulations for coke production will likely be promulgated after 2003, which will either increase the cost of high-quality coke production or reduce the available domestic U.S. supply. Iron production in the United States utilizing the current, predominant Blast Furnace process will therefore become more costly and would likely be curtailed by a coke shortage. There is thus a significant need to develop or extend the economic viability of Alternate Ironmaking Processes to at least partially replace current and declining blast furnace iron sources and to provide incentives for new capacity expansion. The primary conclusions of this comparative Study of Alternative Ironmaking Process scenarios are: (1) The processes with the best combined economics (CAPEX and OPEX impacts in the I.R.R. calculation) can be grouped into those Fine Ore based processes with no scrap

3. Post-processing for statistical image analysis in light microscopy.

Science.gov (United States)

Cardullo, Richard A; Hinchcliffe, Edward H

2013-01-01

Image processing serves a number of important functions, including noise reduction, contrast enhancement, and feature extraction. Whatever the final goal, an understanding of the nature of image acquisition and digitization, and of the subsequent mathematical manipulations of the digitized image, is essential. Here we discuss the basic mathematical and statistical processes routinely used by microscopists to produce high-quality digital images and to extract key features of interest using a variety of extraction and thresholding tools. Copyright © 2013 Elsevier Inc. All rights reserved.
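One of the classic thresholding tools of the kind mentioned above is Otsu's method, which picks the threshold maximising between-class variance over the image histogram. A minimal sketch follows; the synthetic bimodal "image" is a stand-in for a digitized micrograph.

```python
import numpy as np

# Sketch: Otsu's thresholding over an 8-bit histogram. The threshold
# that maximises the between-class variance separates foreground from
# background. The image data are synthetic.

def otsu_threshold(image):
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        between = w0 * w1 * (mu0 - mu1) ** 2      # between-class variance
        if between > best_var:
            best_t, best_var = t, between
    return best_t

rng = np.random.default_rng(1)
img = np.clip(np.concatenate([rng.normal(60, 10, 5000),
                              rng.normal(180, 12, 5000)]), 0, 255).astype(np.uint8)
t = otsu_threshold(img)   # lands between the two intensity modes
```

Pixels above `t` would be labeled as features of interest; in practice this is followed by the feature-extraction steps the abstract describes.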

4. Fuel quality processing study, volume 1

Science.gov (United States)

Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

1981-01-01

A fuel quality processing study to provide a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and also, to guide the development of specifications of future synthetic fuels anticipated for use in the time period 1985 to 2000 is given. Four technical performance tests are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, and turbine fuel, JPGs and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations of impurities and emission of pollutants.

5. Failure of the Volume Function in Granular Statistical Mechanics and an Alternative Formulation.

Science.gov (United States)

Blumenfeld, Raphael; Amitai, Shahar; Jordan, Joe F; Hihinashvili, Rebecca

2016-04-08

We first show that the currently accepted statistical mechanics for granular matter is flawed. The reason is that it is based on the volume function, which depends only on a minute fraction of all the structural degrees of freedom and is unaffected by most of the configurational microstates. Consequently, the commonly used partition function underestimates the entropy severely. We then propose a new formulation, replacing the volume function with a connectivity function that depends on all the structural degrees of freedom and accounts correctly for the entire entropy. We discuss the advantages of the new formalism and derive explicit results for two- and three-dimensional systems. We test the formalism by calculating the entropy of an experimental two-dimensional system, as a function of system size, and showing that it is an extensive variable.

6. Competent statistical programmer: Need of business process outsourcing industry.

Science.gov (United States)

Khan, Imran

2014-07-01

Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing on a cost arbitrage. Among biometrics outsourcing services, statistical programming and analysis require very niche skills for service delivery. Demand and supply are imbalanced due to a high churn-out rate and a short supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes.

7. Competent statistical programmer: Need of business process outsourcing industry

Directory of Open Access Journals (Sweden)

Imran Khan

2014-01-01

Full Text Available Over the last two decades, Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing on a cost arbitrage. Among biometrics outsourcing services, statistical programming and analysis require very niche skills for service delivery. Demand and supply are imbalanced due to a high churn-out rate and a short supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes.

8. Design and Statistics in Quantitative Translation (Process) Research

DEFF Research Database (Denmark)

Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

2015-01-01

is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale......Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars since this way of thinking...... and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product....

9. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

Science.gov (United States)

Williams Colin P.

1999-01-01

Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random, intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

10. On the joint statistics of stable random processes

Energy Technology Data Exchange (ETDEWEB)

Hopcraft, K I [School of Mathematical Sciences, University of Nottingham, NG7 2RD (United Kingdom); Jakeman, E, E-mail: keith.hopcraft@nottingham.ac.uk [School of Electrical and Electronic Engineering, University of Nottingham, NG7 2RD (United Kingdom)

2011-10-28

A utilitarian continuous bi-variate random process whose first-order probability density function is that of a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint probability density for the process, these include fractional moments and structure functions. Although correlation functions do not exist for stable processes other than the Gaussian, we show that there is coherence between the values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint density function of the value of the process and its derivative measured at the same time, are evaluated. These enable properties such as level crossing statistics and those related to the random telegraph wave to be calculated analytically. When the stable process is fractal, the proportion of time it spends at zero is finite, and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)

11. Statistical method for detecting structural change in the growth process.

Science.gov (United States)

Ninomiya, Yoshiyuki; Yoshimoto, Atsushi

2008-03-01

Due to competition among individual trees and other exogenous factors that change the growth environment, each tree grows following its own growth trend with some structural changes in growth over time. In the present article, a new method is proposed to detect a structural change in the growth process. We formulate the method as a simple statistical test for signal detection without constructing any specific model for the structural change. To evaluate the p-value of the test, the tube method is developed because the regular distribution theory is insufficient. Using two sets of tree diameter growth data sampled from planted forest stands of Cryptomeria japonica in Japan, we conduct an analysis of identifying the effect of thinning on the growth process as a structural change. Our results demonstrate that the proposed method is useful to identify the structural change caused by thinning. We also provide the properties of the method in terms of the size and power of the test.

12. Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example

Directory of Open Access Journals (Sweden)

Gmel Gerrit

2010-03-01

Full Text Available Abstract. Background: Alcohol consumption is a major risk factor in the global burden of disease, with overall volume of exposure as the principal underlying dimension. Two main sources of data on volume of alcohol exposure are available: surveys and per capita consumption derived from routine statistics such as taxation. As both sources have significant problems, this paper presents an approach that triangulates information from both sources into disaggregated estimates in line with the overall level of per capita consumption. Methods: A modeling approach was applied to the US using data from a large and representative survey, the National Epidemiologic Survey on Alcohol and Related Conditions. Different distributions (log-normal, gamma, Weibull) were used to model consumption among drinkers in subgroups defined by sex, age, and ethnicity. The gamma distribution was used to shift the fitted distributions in line with the overall volume as derived from per capita estimates. Implications for alcohol-attributable fractions were presented, using liver cirrhosis as an example. Results: The triangulation of survey data with aggregated per capita consumption data proved feasible and allowed for modeling of alcohol exposure disaggregated by sex, age, and ethnicity. These models can be used in combination with risk relations for burden of disease calculations. Sensitivity analyses showed that the chosen gamma distribution yielded very similar results, in terms of fit and alcohol-attributable mortality, to the other tested distributions. Conclusions: Modeling alcohol consumption via the gamma distribution was feasible. To further refine this approach, research should focus on the main assumptions underlying the approach to explore differences between volume estimates derived from surveys and per capita consumption figures.
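The gamma-based shifting step can be sketched with a method-of-moments fit followed by a rescaling of the mean to the per-capita level. All numbers below are hypothetical illustrations, not US estimates from the survey.

```python
# Sketch: fit a gamma distribution to survey-based consumption by the
# method of moments, then rescale so its mean matches the level implied
# by per capita consumption, in the spirit of the triangulation above.
# All numeric values are hypothetical.

survey_mean, survey_sd = 12.0, 15.0   # g/day among drinkers (assumed)
per_capita_mean = 20.0                # mean implied by per capita data (assumed)

# Gamma method of moments: shape k = mean^2 / var, scale theta = var / mean
k = survey_mean ** 2 / survey_sd ** 2
theta = survey_sd ** 2 / survey_mean

# Shift: keep the shape (distributional form), adjust the scale so that
# the gamma mean k * theta equals the per capita figure.
theta_adj = per_capita_mean / k
adjusted_mean = k * theta_adj
```

Keeping the shape parameter fixed preserves the relative spread of consumption across drinkers while anchoring the overall volume to the more reliable aggregate figure.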

13. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

KAUST Repository

Yoon, Seyoon

2014-03-01

High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus, respectively. Applicability of the CEB-FIP (Comité Euro-International du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion better predicts the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of the fly ash substitution rate was initially higher than that of the w/b ratio but reduced over time. © 2014 Elsevier Ltd. All rights reserved.

14. Statistical Optimization of Process Variables for Antibiotic Activity of Xenorhabdus bovienii

Science.gov (United States)

Cao, Xue-Qiang; Zhu, Ming-Xuan; Zhang, Xing; Wang, Yong-Hong

2012-01-01

The production of secondary metabolites with antibiotic properties is a characteristic common to the entomopathogenic bacteria Xenorhabdus spp. These metabolites not only have diverse chemical structures but also a wide range of bioactivities of medicinal and agricultural interest. Culture variables are critical to the production of secondary metabolites of microorganisms. Manipulating culture process variables can promote secondary metabolite biosynthesis and thus facilitate the discovery of novel natural products. This work was conducted to evaluate the effects of five process variables (initial pH, medium volume, rotary speed, temperature, and inoculation volume) on the antibiotic production of Xenorhabdus bovienii YL002 using response surface methodology. A 2^(5−1) factorial central composite design was chosen to determine the combined effects of the five variables and to design a minimum number of experiments. The experimental and predicted antibiotic activity of X. bovienii YL002 was in close agreement. Statistical analysis of the results showed that initial pH, medium volume, rotary speed and temperature had a significant effect (P < 0.05) on the antibiotic production of X. bovienii YL002 at their individual level; medium volume and rotary speed showed a significant effect at a combined level and were most significant at an individual level. The maximum antibiotic activity (287.5 U/mL) was achieved at an initial pH of 8.24, medium volume of 54 mL in a 250 mL flask, rotary speed of 208 rpm, temperature of 32.0°C and inoculation volume of 13.8%. After optimization, the antibiotic activity was improved by 23.02% compared with that under unoptimized conditions. PMID:22701637
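The 16-run 2^(5−1) fractional factorial core of such a central composite design can be generated directly, here under the common defining relation E = ABCD (the paper does not state its aliasing, so this relation is an assumption). Factor names follow the five process variables; the coding of ±1 levels to real values is omitted.

```python
from itertools import product

# Sketch: the 16-run 2^(5-1) fractional factorial core of a central
# composite design for five process variables. The defining relation
# I = ABCDE (i.e. E = ABCD) is an assumed, standard choice.

factors = ["pH", "medium_volume", "rotary_speed", "temperature", "inoculation_volume"]
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d                 # fifth factor aliased via E = ABCD
    runs.append(dict(zip(factors, (a, b, c, d, e))))

n_runs = len(runs)                    # 16 runs instead of the full 2^5 = 32
```

Halving the design this way sacrifices only high-order interactions, which is why response surface studies with five factors routinely start from a 2^(5−1) core.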

15. Estimating the volume and age of water stored in global lakes using a geo-statistical approach

Science.gov (United States)

Messager, Mathis Loïc; Lehner, Bernhard; Grill, Günther; Nedeva, Irena; Schmitt, Oliver

2016-12-01

Lakes are key components of biogeochemical and ecological processes, thus knowledge about their distribution, volume and residence time is crucial in understanding their properties and interactions within the Earth system. However, global information is scarce and inconsistent across spatial scales and regions. Here we develop a geo-statistical model to estimate the volume of global lakes with a surface area of at least 10 ha based on the surrounding terrain information. Our spatially resolved database shows 1.42 million individual polygons of natural lakes with a total surface area of 2.67 × 10^6 km^2 (1.8% of global land area), a total shoreline length of 7.2 × 10^6 km (about four times longer than the world's ocean coastline) and a total volume of 181.9 × 10^3 km^3 (0.8% of total global non-frozen terrestrial water stocks). We also compute mean and median hydraulic residence times for all lakes to be 1,834 days and 456 days, respectively.
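The hydraulic residence time summarised above is, for a single lake, simply stored volume divided by outflow. A minimal sketch with hypothetical values for one lake (not figures from the database):

```python
# Sketch: hydraulic residence time of a lake as volume / outflow.
# Both values below are hypothetical, chosen only to show the units.

volume_km3 = 25.0            # lake volume in km^3 (assumed)
outflow_km3_per_day = 0.01   # mean outflow in km^3/day (assumed)

residence_time_days = volume_km3 / outflow_km3_per_day
```

Because residence times are heavily right-skewed across lakes, the global mean (1,834 days) sits far above the median (456 days).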

16. The statistical approach for turbulent processes in the Earth's magnetosphere

Science.gov (United States)

Kozak, Liudmyla; Savin, Sergey; Budaev, Vyacheslav; Pilipenko, Viacheslav

The scaling features of the probability distribution functions (PDFs) of the magnetic field fluctuations in different regions of the Earth's magnetosphere and in the solar wind plasma at different timescales were considered. Data obtained by the Interball spacecraft were used. Changes in the shape and parameters of the probability distribution function for periods when the satellite was located in different magnetosphere regions were examined. The probabilities of return P(0) as a function of t, and kurtosis values at different timescales, were used for the analysis. Two asymptotic regimes of P(0), characterized by different power laws, were found. In particular, while at large timescales the scaling agrees quite well with the typical scaling features of a normal Gaussian process, in the limit of small timescales the observed scaling resembles the behavior of a Lévy process. The crossover characteristic timescale corresponds to 1 s. This value can be connected with the ion gyrofrequency. In addition, the structure functions of different orders were investigated for the analysis of turbulent processes, and the obtained results were compared with the log-Poisson cascade model. Near the magnetospheric boundaries the statistical study reveals the super-diffusive character of the transport processes. In the quiet magnetosheath and in the solar wind, classical diffusion is recovered.

17. Finite Element Surface Registration Incorporating Curvature, Volume Preservation, and Statistical Model Information

Directory of Open Access Journals (Sweden)

Thomas Albrecht

2013-01-01

Full Text Available We present a novel method for nonrigid registration of 3D surfaces and images. The method can be used to register surfaces by means of their distance images, or to register medical images directly. It is formulated as a minimization problem of a sum of several terms representing the desired properties of a registration result: smoothness, volume preservation, matching of the surface, its curvature, and possible other feature images, as well as consistency with previous registration results of similar objects, represented by a statistical deformation model. While most of these concepts are already known, we present a coherent continuous formulation of these constraints, including the statistical deformation model. This continuous formulation renders the registration method independent of its discretization. The finite element discretization we present is, while independent of the registration functional, the second main contribution of this paper. The local discontinuous Galerkin method has not previously been used in image registration, and it provides an efficient and general framework to discretize each of the terms of our functional. Computational efficiency and modest memory consumption are achieved thanks to parallelization and locally adaptive mesh refinement. This allows for the first time the use of otherwise prohibitively large 3D statistical deformation models.

18. Statistics

CERN Document Server

Hayslett, H T

1991-01-01

Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

19. A Statistics-Based Cracking Criterion of Resin-Bonded Silica Sand for Casting Process Simulation

Science.gov (United States)

Wang, Huimin; Lu, Yan; Ripplinger, Keith; Detwiler, Duane; Luo, Alan A.

2017-02-01

Cracking of sand molds/cores can result in many casting defects such as veining. A robust cracking criterion is needed in casting process simulation for predicting/controlling such defects. A cracking probability map, relating to fracture stress and effective volume, was proposed for resin-bonded silica sand based on Weibull statistics. Three-point bending test results of sand samples were used to generate the cracking map and set up a safety line for cracking criterion. Tensile test results confirmed the accuracy of the safety line for cracking prediction. A laboratory casting experiment was designed and carried out to predict cracking of a cup mold during aluminum casting. The stress-strain behavior and the effective volume of the cup molds were calculated using a finite element analysis code ProCAST®. Furthermore, an energy dispersive spectroscopy fractographic examination of the sand samples confirmed the binder cracking in resin-bonded silica sand.
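
The Weibull weakest-link statistics behind such a cracking probability map relate failure probability to both the applied stress and the effective (stressed) volume. A minimal sketch of the relation (the modulus m, characteristic strength sigma0, and reference volume v0 below are illustrative values, not parameters from the study):

```python
import math

def weibull_failure_probability(stress, eff_volume, m, sigma0, v0=1.0):
    """Weibull weakest-link cracking probability for a stressed volume.

    stress     : applied stress (e.g. MPa)
    eff_volume : effective (stressed) volume, same units as v0
    m          : Weibull modulus (scatter of strength); illustrative here
    sigma0     : characteristic strength of the reference volume v0
    """
    return 1.0 - math.exp(-(eff_volume / v0) * (stress / sigma0) ** m)

# Illustrative numbers only (not from the cited study): at the same stress,
# a larger effective volume is more likely to crack -- the Weibull size effect.
p_small = weibull_failure_probability(4.0, 1.0, m=8, sigma0=5.0)
p_large = weibull_failure_probability(4.0, 10.0, m=8, sigma0=5.0)
print(p_small, p_large)
```

A "safety line" of the kind described in the paper corresponds to a contour of constant failure probability in the (effective volume, fracture stress) plane.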

20. Application of statistical process control to qualitative molecular diagnostic assays

LENUS (Irish Health Repository)

O'Brien, Cathal P.

2014-11-01

Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
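
A minimal sketch of the frequency-plus-confidence-interval idea, assuming a simple 3-sigma binomial limit around the expected mutation frequency (the paper's exact calculation may differ, and the 40% expected frequency below is purely illustrative):

```python
import math

def control_limits(expected_freq, n, z=3.0):
    """3-sigma binomial control limits around an expected mutation frequency."""
    se = math.sqrt(expected_freq * (1 - expected_freq) / n)
    lower = max(0.0, expected_freq - z * se)
    upper = min(1.0, expected_freq + z * se)
    return lower, upper

def in_control(observed_positives, n, expected_freq, z=3.0):
    """Flag whether an observed mutation rate is consistent with expectation."""
    lower, upper = control_limits(expected_freq, n, z)
    return lower <= observed_positives / n <= upper

# illustrative: a mutation expected in ~40% of samples
print(in_control(30, 100, 0.40))   # 30% observed in 100 samples: within limits
print(in_control(10, 100, 0.40))   # 10% observed: flags a deviation
```

With small n the limits widen toward [0, 1], so small deviations become undetectable, which mirrors the paper's point about minimum sample number requirements.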

1. Introduction to statistical physics of media processes: Mediaphysics

CERN Document Server

Kuznetsov, D V; Kuznetsov, Dmitri V.; Mandel, Igor

2005-01-01

Processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, animal populations, etc. as a subject for the special scientific discipline - "mediaphysics" - are considered in its relation with sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a "person's mind" and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (mean-field influential propagation of opinions, synergy effects, etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-...

2. Processes and subdivisions in diogenites, a multivariate statistical analysis

Science.gov (United States)

Harriott, T. A.; Hewins, R. H.

1984-01-01

Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

3. Scaling limits for Hawkes processes and application to financial statistics

CERN Document Server

Bacry, Emmanuel; Hoffmann, Marc; Muzy, Jean François

2012-01-01

We prove a law of large numbers and a functional central limit theorem for multivariate Hawkes processes observed over a time interval $[0,T]$ in the limit $T \\rightarrow \\infty$. We further exhibit the asymptotic behaviour of the covariation of the increments of the components of a multivariate Hawkes process, when the observations are imposed by a discrete scheme with mesh $\\Delta$ over $[0,T]$ up to some further time shift $\\tau$. The behaviour of this functional depends on the relative size of $\\Delta$ and $\\tau$ with respect to $T$ and enables us to give a full account of the second-order structure. As an application, we develop our results in the context of financial statistics. We introduced in a previous work a microscopic stochastic model for the variations of a multivariate financial asset, based on Hawkes processes and confined to live on a tick grid. We derive and characterise the exact macroscopic diffusion limit of this model and show in particular its ability to reproduce important empiric...
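
For intuition, a univariate Hawkes process of the kind underlying such models can be simulated with Ogata's thinning algorithm. The sketch below (illustrative only, not the authors' multivariate tick-grid model) uses the common exponential kernel, for which the excitation can be updated recursively:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta (t - t_i))
    by Ogata's thinning algorithm; stationary when alpha / beta < 1."""
    rng = np.random.default_rng(seed)
    events = []
    t, g = 0.0, 0.0                    # g = excitation part of the intensity
    while True:
        lam_bar = mu + g               # intensity can only decay until next event
        w = rng.exponential(1.0 / lam_bar)
        t += w
        if t > T:
            break
        g *= np.exp(-beta * w)         # decay excitation to the candidate time
        if rng.uniform() * lam_bar <= mu + g:   # accept with prob lambda(t)/lam_bar
            events.append(t)
            g += alpha                 # each event adds a jump of size alpha
    return np.array(events)

ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, T=1000.0)
# the mean rate should approach mu / (1 - alpha/beta), about 1.07 per unit time
print(len(ev) / 1000.0)
```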

4. Determination of the minimum size of a statistical representative volume element from a fibre-reinforced composite based on point pattern statistics

DEFF Research Database (Denmark)

Hansen, Jens Zangenberg; Brøndsted, Povl

2013-01-01

In a previous study, Trias et al. [1] determined the minimum size of a statistical representative volume element (SRVE) of a unidirectional fibre-reinforced composite primarily based on numerical analyses of the stress/strain field. In continuation of this, the present study determines the minimum...

5. Statistical Control Detection Adaptive (SCDA) Modeling for Monitoring and Prediction of National Crude Palm Oil (CPO) Production Volume

Directory of Open Access Journals (Sweden)

Wahyu Widji Pamungkas

2017-07-01

Full Text Available The achievement of the national palm oil industry as a producer and exporter of crude palm oil (CPO) is now raising concerns. The growth of the upstream and downstream segments of the national palm oil industry has been unbalanced, which encourages industry players to orient themselves toward CPO exports and thereby forgo added value in the country. Although commodity exports bring in foreign exchange, an export-oriented commodity is prone to barriers in the international market. It is therefore important to provide a means of monitoring, prediction, and assessment to support policy making for the marketing of the national CPO industry. This research proposes a model framework with an adaptive threshold, called statistical control detection adaptive (SCDA), as a means of monitoring, predicting, and assessing the movement of national CPO production volume. The idea of SCDA is to determine a dynamic threshold by mapping patterns in historical data and predictions, in terms of both frequency and trend. The SCDA model adapts techniques of statistical process control (SPC), while the predicted values are generated from a prediction model developed using back-propagation artificial neural networks (ANN-BP) on historical data of national CPO production volume. The data used were the average annual national CPO production volumes for the period 1967 to 2015. The simulation results predict national CPO production volumes of 31.025, 32.214, and 34.504 million tons for 2016, 2017, and 2018, respectively, while the maximum and minimum thresholds formed by the SCDA model for the period 2016-2018 are 33,322,065 and 29,246,547, respectively. As far as the literature search results show, SCDA modeling has never

6. Recurrence-time statistics in non-Hamiltonian volume preserving maps and flows

CERN Document Server

da Silva, Rafael M; Manchein, Cesar

2015-01-01

We analyze the recurrence-time statistics (RTS) in three-dimensional non-Hamiltonian volume-preserving systems (VPS): an extended standard map and a fluid model. The extended map is a standard map weakly coupled to an extra dimension which contains a deterministic regular, mixed (regular and chaotic), or chaotic motion. The extra dimension strongly enhances the trapping times, inducing plateaus and distinct algebraic and exponential decays in the RTS plots. The combined analysis of the RTS with the classification of ordered and chaotic regimes and scaling properties allows us to describe the intricate way trajectories penetrate the previously impenetrable regular islands from the uncoupled case. Essentially, the plateaus found in the RTS are related to trajectories that stay for long times inside trapping tubes, not allowing recurrences, and then penetrate the islands (from the uncoupled case) by a diffusive motion along such tubes in the extra dimension. All asymptotic exponential decays for the RTS are ...

7. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

CERN Document Server

Kruger, Uwe

2012-01-01

The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike.  Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering.  The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applica

8. Statistics

Science.gov (United States)

Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

9. [Near infrared spectroscopy and multivariate statistical process analysis for real-time monitoring of production process].

Science.gov (United States)

Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Zou, Quan; Wang, Jun; Tu, Jia-Run; Cai, Wen-Sheng; Shao, Xue-Guang

2013-05-01

Near infrared diffuse reflectance spectroscopy has been applied in on-site or on-line analysis owing to its speed, non-destructive nature, and feasibility for the analysis of real, complex samples. The present work reports a real-time monitoring method for industrial production using near infrared spectroscopy and multivariate statistical process analysis. In the method, real-time near infrared spectra of the materials are collected on the production line, and the production process is then evaluated with a Hotelling T2 statistic calculated from an established model. In this work, principal component analysis (PCA) is adopted for building the model, and the statistic is calculated by projecting the real-time spectra onto the PCA model. In an application of the method to a practical production process, it was demonstrated that variations in production can be evaluated in real time by investigating changes in the statistic, and that products from different batches can be compared by further statistics computed on the statistic. The proposed method may therefore provide a practical way to assure the quality of production processes.
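
The PCA/Hotelling T2 monitoring statistic described here can be sketched as follows. The "spectra" are simulated, and the code is a generic PCA-T2 recipe under those assumptions, not the authors' implementation:

```python
import numpy as np

def fit_pca_t2(X_train, k):
    """Fit a PCA model on in-control data; return a Hotelling T2 scorer
    and the orthonormal loading matrix P (d x k)."""
    mu = X_train.mean(axis=0)
    U, s, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:k].T                              # loadings of the k retained components
    var = s[:k] ** 2 / (len(X_train) - 1)     # variance of each score
    def t2(x):
        scores = (x - mu) @ P                 # project a new spectrum onto the model
        return float(np.sum(scores ** 2 / var))
    return t2, P

# simulated "spectra": 3 latent factors observed through 6 channels plus noise
rng = np.random.default_rng(1)
train = (rng.normal(size=(200, 3)) @ rng.normal(size=(3, 6))
         + 0.1 * rng.normal(size=(200, 6)))
t2, P = fit_pca_t2(train, k=3)
print(t2(train[0]))                    # in-control sample: modest T2
print(t2(train[0] + 100 * P[:, 0]))    # strong disturbance: T2 blows up
```

The average T2 over the training samples equals k(n-1)/n by construction, which is a convenient sanity check before setting control limits.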

10. Ultrasonic attenuation spectroscopy for multivariate statistical process control in nanomaterial processing

Institute of Scientific and Technical Information of China (English)

Bundit Boonkhao; Xue Z. Wang

2012-01-01

Ultrasonic attenuation spectroscopy (UAS) is an attractive process analytical technology (PAT) for on-line, real-time characterisation of slurries for particle size distribution (PSD) estimation. It is, however, only applicable to relatively low solid concentrations, since existing instrument process models still cannot fully take into account the phenomena of particle-particle interaction and multiple scattering, leading to errors in PSD estimation. This paper investigates an alternative use of the raw attenuation spectra for direct multivariate statistical process control (MSPC). The UAS raw spectra were processed using principal component analysis. The selected principal components were used to derive two MSPC statistics, Hotelling's T2 and the squared prediction error (SPE). The method is illustrated and demonstrated by reference to a wet milling process for processing nanoparticles.
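
The two MSPC statistics are complementary: T2 measures variation inside the PCA subspace, while the squared prediction error (SPE, also called Q) measures the part of a spectrum the model cannot reconstruct. A generic sketch of SPE on simulated data (not the UAS spectra of the study):

```python
import numpy as np

def fit_pca_spe(X_train, k):
    """PCA residual statistic (SPE / Q): squared reconstruction error."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:k].T
    def spe(x):
        r = (x - mu) - ((x - mu) @ P) @ P.T   # part of x outside the PCA subspace
        return float(r @ r)
    return spe

rng = np.random.default_rng(2)
# "spectra" living mostly in a 2-dimensional subspace plus small noise
basis = rng.normal(size=(2, 20))
train = rng.normal(size=(300, 2)) @ basis + 0.01 * rng.normal(size=(300, 20))
spe = fit_pca_spe(train, k=2)
spe_in = spe(train[0])                        # near zero: sample fits the model
spe_out = spe(train[0] + rng.normal(size=20)) # off-model disturbance inflates SPE
print(spe_in, spe_out)
```

A disturbance that lies inside the retained subspace would raise T2 but not SPE, which is why both statistics are monitored together.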

11. Agricultural Handling and Processing Industries; Data Pertinent to an Evaluation of Overtime Exemptions Available under the Fair Labor Standards Act. Volume II, Appendices.

Science.gov (United States)

Wage and Labor Standards Administration (DOL), Washington, DC.

Definitions of terms used in the Fair Labor Standards Act and statistical tables compiled from a survey of agricultural processing firms comprise this appendix, which is the second volume of a two volume report. Volume I is available as VT 012 247. (BH)

12. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

Science.gov (United States)

Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

1999-01-01

This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

14. Statistical mechanics of fragmentation processes of ice and rock bodies

Science.gov (United States)

Bashkirov, A. G.; Vityazev, A. V.

1996-09-01

It is a well-known experimental fact that impact fragmentation, specifically of ice and rock bodies, produces a two-step ("knee"-shaped) power-law distribution of fragment masses with exponent values between -4 and -1.5 (here and henceforth the differential distribution is meant). A new theoretical approach is proposed to determine the exponent values, a minimal fracture mass, and the properties of the knee. As a basis for the construction of a non-equilibrium statistical mechanics of condensed matter fragmentation, the maximum-entropy variational principle is used. In contrast to the usual approach founded on the Boltzmann entropy, the more general Tsallis entropy is invoked, which allows stationary solutions not only of the exponential Boltzmann-Gibbs form but also of power-law (fractal) form. Relying on the analysis of many published experiments, a parameter β is introduced to describe an inhomogeneous distribution of the impact energy over the target. It varies from 0 (for an utterly inhomogeneous distribution of the impact energy) to 1 (for a homogeneous distribution). The lower limit of fragment masses is defined as the characteristic fragment mass for which the energy of fragment formation is minimal; this mass value depends crucially on the value of β. It is shown that for β≪1 only small fragments can be formed, and the maximal permitted fragment (of mass m1) marks the upper boundary of the first stage of the fracture process and the point where the knee takes place. The second stage may be realized after a homogeneous redistribution of the remainder of the impact energy over the remainder of the target (when β→1). Here, only the formation of large fragments is permitted, and the smallest of them (of mass m2) determines the lower boundary of the second stage. Different forms of the knee can be observed depending on the relation between m1 and m2.

15. Recurrence-time statistics in non-Hamiltonian volume-preserving maps and flows

Science.gov (United States)

da Silva, Rafael M.; Beims, Marcus W.; Manchein, Cesar

2015-08-01

We analyze the recurrence-time statistics (RTS) in three-dimensional non-Hamiltonian volume-preserving systems (VPS): an extended standard map and a fluid model. The extended map is a standard map weakly coupled to an extra dimension which contains a deterministic regular, mixed (regular and chaotic), or chaotic motion. The extra dimension strongly enhances the trapping times inducing plateaus and distinct algebraic and exponential decays in the RTS plots. The combined analysis of the RTS with the classification of ordered and chaotic regimes and scaling properties allows us to describe the intricate way trajectories penetrate the previously impenetrable regular islands from the uncoupled case. Essentially the plateaus found in the RTS are related to trajectories that stay for long times inside trapping tubes, not allowing recurrences, and then penetrate diffusively the islands (from the uncoupled case) by a diffusive motion along such tubes in the extra dimension. All asymptotic exponential decays for the RTS are related to an ordered regime (quasiregular motion), and a mixing dynamics is conjectured for the model. These results are compared to the RTS of the standard map with dissipation or noise, showing the peculiarities obtained by using three-dimensional VPS. We also analyze the RTS for a fluid model and show remarkable similarities to the RTS in the extended standard map problem.
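
For intuition, recurrence times are straightforward to measure numerically. The sketch below uses the ordinary two-dimensional standard map in a strongly chaotic regime (not the extended three-dimensional system of the paper), where the RTS decay is roughly exponential and the mean return time follows Kac's lemma:

```python
import math
import numpy as np

def standard_map(theta, p, K, n_steps):
    """Iterate the Chirikov standard map on the 2-pi torus."""
    traj = np.empty((n_steps, 2))
    for i in range(n_steps):
        p = (p + K * math.sin(theta)) % (2 * math.pi)
        theta = (theta + p) % (2 * math.pi)
        traj[i] = theta, p
    return traj

def recurrence_times(traj, center, radius):
    """Times between successive visits of the orbit to a ball in phase space.
    (Plain Euclidean distances; the ball is chosen away from the torus edges.)"""
    inside = np.linalg.norm(traj - center, axis=1) < radius
    entries = np.flatnonzero(inside)
    return np.diff(entries)

traj = standard_map(0.5, 0.5, K=8.0, n_steps=200_000)   # strongly chaotic regime
rts = recurrence_times(traj, np.array([3.0, 3.0]), radius=0.5)
# by Kac's lemma the mean return time is about
# 1 / (fraction of phase space inside the ball), here roughly 50 iterations
print(len(rts), rts.mean())
```

In the coupled three-dimensional systems of the paper, sticking to trapping tubes produces the plateaus and mixed algebraic/exponential decays that this fully chaotic example lacks.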

16. Spatio-temporal statistical models with applications to atmospheric processes

Energy Technology Data Exchange (ETDEWEB)

Wikle, C.K.

1996-12-31

This doctoral dissertation is presented as three self-contained papers. An introductory chapter considers traditional spatio-temporal statistical methods used in the atmospheric sciences from a statistical perspective. Although this section is primarily a review, many of the statistical issues raised have not previously been considered in the context of these methods, and several open questions are posed. The first paper attempts to determine a means of characterizing the spatial variation of the semiannual oscillation (SAO) in the northern hemisphere extratropical height field. It was discovered that the midlatitude SAO in 500 hPa geopotential height could be explained almost entirely as a result of spatial and temporal asymmetries in the annual variation of stationary eddies, and it was concluded that the mechanism for the SAO in the northern hemisphere is land-sea contrast. The second paper examines the seasonal variability of mixed Rossby-gravity waves (MRGW) in the lower stratosphere over the equatorial Pacific, using advanced cyclostationary time series techniques. It was found that there are significant twice-yearly peaks in MRGW activity, and the analyses also suggested a convergence of horizontal momentum flux associated with these waves. In the third paper, a new spatio-temporal statistical model is proposed that considers the influence of both temporal and spatial variability. The model is mainly concerned with prediction in space and time, and is spatially descriptive and temporally dynamic.

18. [AN OVERALL SOUND PROCESS] Syntactic parameters, statistic parameters, and universals

Directory of Open Access Journals (Sweden)

Nicolas Meeùs

2016-05-01

My paper intends to show that comparative musicology, in fact if not in principle, appears inherently linked to the syntactic elements of music, and so too is any encyclopedic project aiming at uncovering universals in music. It is not that statistical elements cannot be universal, but that they cannot be discussed as such, because they remain largely unquantifiable.

19. Multivariate statistical analysis of a multi-step industrial processes

DEFF Research Database (Denmark)

Reinikainen, S.P.; Høskuldsson, Agnar

2007-01-01

Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized.... This approach will show how the process develops from a data point of view. The procedure is illustrated on a relatively simple industrial batch process, but it is also applicable in a general context, where knowledge about the variables is available....

20. Statistical tests for power-law cross-correlated processes.

Science.gov (United States)

Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H Eugene

2011-12-01

For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series; the latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlation analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρ(DCCA)(T,n), where T is the total length of the time series and n the window size. For ρ(DCCA)(T,n), we numerically verify the Cauchy inequality -1 ≤ ρ(DCCA)(T,n) ≤ 1. Here we derive -1 ≤ ρ(DCCA)(T,n) ≤ 1 for both a standard variance-covariance approach and a detrending approach. For overlapping windows, we find the range of ρ(DCCA) within which the cross-correlations become statistically significant. For overlapping windows we numerically determine, and for nonoverlapping windows we derive, that the standard deviation of ρ(DCCA)(T,n) tends with increasing T to 1/T. Using ρ(DCCA)(T,n), we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
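
The detrended cross-correlation coefficient can be implemented directly from its definition. A sketch with overlapping windows and local linear detrending (the series lengths and window size are illustrative):

```python
import numpy as np

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient rho_DCCA(T, n), using
    overlapping windows of size n and local linear detrending."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(n)
    f2x = f2y = f2xy = 0.0
    for i in range(len(X) - n + 1):
        rx = X[i:i + n] - np.polyval(np.polyfit(t, X[i:i + n], 1), t)
        ry = Y[i:i + n] - np.polyval(np.polyfit(t, Y[i:i + n], 1), t)
        f2x += (rx * rx).mean()
        f2y += (ry * ry).mean()
        f2xy += (rx * ry).mean()
    return f2xy / np.sqrt(f2x * f2y)    # Cauchy-Schwarz keeps this in [-1, 1]

rng = np.random.default_rng(3)
z = rng.normal(size=2000)
a = z + 0.1 * rng.normal(size=2000)     # two noisy copies of a common signal
b = z + 0.1 * rng.normal(size=2000)
c = rng.normal(size=2000)               # independent of a
r_coupled, r_indep = rho_dcca(a, b, n=10), rho_dcca(a, c, n=10)
print(r_coupled, r_indep)
```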

1. Statistical Processes Under Change: Enhancing Data Quality with Pretests

Science.gov (United States)

Statistical offices in Europe, in particular the Federal Statistical Office in Germany, are meeting users’ ever more demanding requirements with innovative and appropriate responses, such as the multiple sources mixed-mode design model. This combines various objectives: reducing survey costs and the burden on interviewees, and maximising data quality. The same improvements are also being sought by way of the systematic use of pretests to optimise survey documents. This paper provides a first impression of the many procedures available. An ideal pretest combines both quantitative and qualitative test methods. Quantitative test procedures can be used to determine how often particular input errors arise. The questionnaire is tested in the field in the corresponding survey mode. Qualitative test procedures can find the reasons for input errors. Potential interviewees are included in the questionnaire tests, and their feedback on the survey documentation is systematically analysed and used to upgrade the questionnaire. This was illustrated in our paper by an example from business statistics (“Umstellung auf die Wirtschaftszweigklassifikation 2008” - Change-over to the 2008 economic sector classification). This pretest not only gave important clues about how to improve the contents, but also helped to realistically estimate the organisational cost of the main survey.

2. Application of Statistical Process Control Methods for IDS

Directory of Open Access Journals (Sweden)

2012-11-01

Full Text Available As technology improves, attackers try to gain access to network system resources by many means, and open loopholes in the network allow them to penetrate it more easily. Statistical methods are of great importance in the area of computer and network security for detecting malfunctioning of the network system, and the development of internet security solutions is needed to protect systems and withstand prolonged and diverse attacks. In this paper a statistical approach has been used: statistical control charts, conventionally applied to quality characteristics, are applied to intrusion detection, where abnormal access can be easily detected and appropriate control limits established. Two different charts are investigated, and the Shewhart chart based on averages produced better accuracy. The approach used here for intrusion detection is that if a data packet is drastically different from normal variation, it can be classified as an attack; in other words, a system variation may be due to some special cause. If these causes are investigated, then natural variation and abnormal variation can be distinguished, which can be used to discriminate behaviours of the system.
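
The control-chart idea can be sketched with an individuals (Shewhart) chart on per-second packet counts. The paper's exact chart construction may differ, and the traffic numbers below are invented for illustration:

```python
import numpy as np

def shewhart_limits(baseline, z=3.0):
    """Individuals (Shewhart) control chart limits from attack-free traffic.
    Sigma is estimated from the mean moving range, the classical I-chart rule."""
    baseline = np.asarray(baseline, dtype=float)
    center = baseline.mean()
    sigma = np.abs(np.diff(baseline)).mean() / 1.128   # d2 for moving ranges of 2
    return center - z * sigma, center, center + z * sigma

# packets per second on a link during normal operation (illustrative numbers)
normal_traffic = [510, 498, 505, 490, 515, 502, 507, 495, 500, 509]
lcl, center, ucl = shewhart_limits(normal_traffic)
for rate in (504, 497, 880):            # 880 pps: a burst far outside the limits
    print(rate, "attack" if not lcl <= rate <= ucl else "normal")
```

A point outside the 3-sigma limits signals a special cause, which in this setting is treated as a candidate intrusion rather than natural traffic variation.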

3. Processing of toxicological studies results in the statistical program R

Directory of Open Access Journals (Sweden)

Fedoseeva Elena Vasilyevna

2015-09-01

Full Text Available The present article is devoted to the analysis of experimental values and the applicability of toxicological study results in the statistical environment R. This freely distributed program has great functional potential and a well-designed algorithm, which make it "...the undisputed leader among the freely distributed systems for statistical analysis..." As data, the experimental results of assessing the toxicity of a highly mineralized sample of industrial production wastes were used. We evaluated two test functions: the change in the population growth of cells and the fluorescence level of a laboratory culture of the marine diatom alga Phaeodactylum tricornutum. A detailed algorithm of the analysis is presented in the article, namely: data initialization, evaluation of sample parameters of descriptive statistics, toxicity assessment, single-factor analysis of variance (ANOVA), Tukey and Dunnett multiple comparison tests, and evaluation of the correlation between the observed variables (Spearman and Pearson correlation coefficients). The complete list of scripts in R allows one to reproduce a similar analysis.

4. Modification of codes NUALGAM and BREMRAD. Volume 3: Statistical considerations of the Monte Carlo method

Science.gov (United States)

Firstenberg, H.

1971-01-01

The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented and the results are statistically interpreted.
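
The statistical interpretation of Monte Carlo results rests on the standard error s/√n of the sample mean. A generic sketch of this (unrelated to the NUGAM codes themselves):

```python
import numpy as np

def mc_estimate(sampler, n, seed=0):
    """Monte Carlo mean with its statistical (1-sigma) uncertainty s/sqrt(n)."""
    rng = np.random.default_rng(seed)
    samples = sampler(rng, n)
    return samples.mean(), samples.std(ddof=1) / np.sqrt(n)

# estimate the mean of U[0,1]^2 (true value 1/3); the quoted error
# should shrink roughly as 1/sqrt(n) as more histories are run
for n in (100, 10_000):
    mean, err = mc_estimate(lambda rng, n: rng.uniform(size=n) ** 2, n)
    print(f"n={n:6d}  estimate={mean:.4f} +/- {err:.4f}")
```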

5. Alternating event processes during lifetimes: population dynamics and statistical inference.

Science.gov (United States)

Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

2017-08-07

In the literature studying recurrent event data, a large amount of work has focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time since onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence, and duration for such alternating event processes, and we provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and the within-process structure, the paper provides a new and general way to study alternating event processes.

6. Automated force volume image processing for biological samples.

Directory of Open Access Journals (Sweden)

Pavel Polyakov

Full Text Available Atomic force microscopy (AFM has now become a powerful technique for investigating on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
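
The robust critical-point detection described above is more sophisticated than can be shown here, but a naive stand-in, thresholding the numerical derivative of a synthetic retraction curve, illustrates the idea (all curve parameters below are invented):

```python
import numpy as np

def detect_jumps(z, force, threshold):
    """Flag candidate critical points where the force changes abruptly between
    consecutive samples (a crude stand-in for the paper's robust detection)."""
    dfdz = np.diff(force) / np.diff(z)
    return np.flatnonzero(np.abs(dfdz) > threshold) + 1

# synthetic retraction curve: gradual loading plus one adhesion rupture
z = np.linspace(0.0, 100.0, 501)       # tip-sample distance, nm (illustrative)
force = -0.05 * z                      # smooth part of the curve
force[z > 60.1] += 4.0                 # sudden detachment jump past ~60 nm
idx = detect_jumps(z, force, threshold=1.0)
print(z[idx])                          # position(s) of the detected rupture
```

A production pipeline would additionally look for changes of slope and curvature and fit the physical models (Hertz-Hooke, Freely Jointed Chain) only between the detected critical points.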

7. Automated Force Volume Image Processing for Biological Samples

Science.gov (United States)

Duan, Junbo; Duval, Jérôme F. L.; Brie, David; Francius, Grégory

2011-01-01

Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between the AFM probe and the bacterium are accounted for, and mechanical interactions operating after contact are described in terms of a Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted by making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image. PMID:21559483
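
The Hertz-type fit of the post-contact region mentioned above can be sketched as follows. This is a minimal illustration on synthetic data with an assumed spherical tip (radius R, Poisson ratio ν); it is not the authors' implementation, which first detects the contact point automatically:

```python
import numpy as np
from scipy.optimize import curve_fit

def hertz_sphere(delta, E, nu=0.5, R=20e-9):
    """Hertz force for a spherical tip: F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2)."""
    return (4.0 / 3.0) * E / (1.0 - nu**2) * np.sqrt(R) * delta**1.5

rng = np.random.default_rng(0)
delta = np.linspace(0, 100e-9, 200)            # indentation depth (m)
E_true = 2.0e6                                 # Young's modulus (Pa), soft-sample scale
force = hertz_sphere(delta, E_true) + rng.normal(0, 2e-11, delta.size)  # noisy curve

# regression step: fit the elastic modulus to the post-contact part of the curve
(E_fit,), _ = curve_fit(lambda d, E: hertz_sphere(d, E), delta, force, p0=[1e6])
print(f"fitted E = {E_fit / 1e6:.2f} MPa")
```

On a real approach curve, the indentation origin (contact point) must be located first, which is exactly the critical-point detection step the paper automates.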

8. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

Science.gov (United States)

Konrad, T. G.; Kropfli, R. A.

1975-01-01

Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

9. Statistical Process Control for Evaluating Contract Service at Army Installations

Science.gov (United States)

1990-09-01

...requirements. In addition to their usage in fault diagnosis and process improvement, process control methods are recommended for supporting acceptance

10. Dynamic Statistical Characterization of Variation in Source Processes of Microseismic Events

Science.gov (United States)

Smith-Boughner, L.; Viegas, G. F.; Urbancic, T.; Baig, A. M.

2015-12-01

During a hydraulic fracture, water is pumped at high pressure into a formation. A proppant, typically sand, is later injected in the hope that it will make its way into a fracture, keep it open, and provide a path for the hydrocarbon to enter the well. This injection can create micro-earthquakes, generated by deformation within the reservoir during treatment. When these injections are monitored, thousands of microseismic events are recorded within several hundred cubic meters. For each well-located event, many source parameters are estimated, e.g., stress drop, Savage-Wood efficiency and apparent stress. However, because we are evaluating outputs from a power-law process, the extent to which the failure is impacted by fluid injection or stress triggering is not immediately clear. To better detect differences in source processes, we use a set of dynamic statistical parameters which characterize various force balance assumptions using the average distance to the nearest event, event rate, volume enclosed by the events, and cumulative moment and energy from a group of events. One parameter, the Fracability index, approximates the ratio of viscous to elastic forcing and highlights differences in the response time of a rock to changes in stress. These dynamic parameters are applied to a database of more than 90 000 events in a shale-gas play in the Horn River Basin to characterize spatio-temporal variations in the source processes. In order to resolve these differences, a moving window, nearest neighbour approach was used. First, the center of mass of the local distribution was estimated for several source parameters. Then, a set of dynamic parameters, which characterize the response of the rock, were estimated. These techniques reveal changes in seismic efficiency and apparent stress that often coincide with marked changes in the Fracability index and other dynamic statistical parameters. Utilizing these approaches allowed for the characterization of fluid injection related
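
The moving-window, nearest-neighbour local averaging step described above can be sketched as follows, on a synthetic event catalogue (the Fracability index and the other dynamic parameters are not reproduced here):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
xyz = rng.uniform(0, 500, (5000, 3))            # microseismic event locations (m)
stress_drop = rng.lognormal(0.0, 1.0, 5000)     # per-event source parameter (MPa)

# moving-window smoothing: a local "centre of mass" of the source parameter,
# averaged over each event's k nearest neighbours
tree = cKDTree(xyz)
_, idx = tree.query(xyz, k=50)                  # indices of 50 nearest events
local = stress_drop[idx].mean(axis=1)           # one smoothed value per event
print(local.shape)
```

Averaging over neighbours suppresses event-to-event scatter, which is what lets slow spatial-temporal trends in the source parameters emerge.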

11. Statistical Process Control of a Kalman Filter Model

Science.gov (United States)

Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A.

2014-01-01

For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. Besides standard tests, the statistical tests include the convergence of the standard deviations of the system state components and the normal distribution of residuals. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations. PMID:25264959
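
One such test in the measurement domain is the normalized innovation squared check. A minimal sketch for a scalar random-walk model follows (noise values are hypothetical; this is not the authors' geodetic setup):

```python
import numpy as np
from scipy.stats import chi2

# 1-D random-walk model: x_k = x_{k-1} + w,  z_k = x_k + v
Q, R = 1e-4, 0.25            # assumed process / measurement noise variances
x, P = 0.0, 1.0              # initial state estimate and its covariance
rng = np.random.default_rng(1)
truth, flagged = 0.0, []
for k in range(200):
    truth += rng.normal(0, np.sqrt(Q))
    z = truth + rng.normal(0, np.sqrt(R))
    P = P + Q                          # predict (state transition is identity)
    nu, S = z - x, P + R               # innovation and its variance
    # statistical test: normalized innovation squared against chi-square(1)
    if nu**2 / S > chi2.ppf(0.99, df=1):
        flagged.append(k)
    K = P / S                          # Kalman gain
    x = x + K * nu                     # update
    P = (1 - K) * P
print(f"epochs flagged at the 1% level: {len(flagged)}")
```

With a correctly specified model, roughly 1% of epochs are flagged; a burst of flags indicates a mismatch between the stochastic model and the data.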

12. Statistical Process Control of a Kalman Filter Model

Directory of Open Access Journals (Sweden)

Sonja Gamse

2014-09-01

Full Text Available For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. Besides standard tests, the statistical tests include the convergence of the standard deviations of the system state components and the normal distribution of residuals. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations.

13. Statistical analysis of the breaking processes of Ni nanowires

Energy Technology Data Exchange (ETDEWEB)

Garcia-Mochales, P [Departamento de Fisica de la Materia Condensada, Facultad de Ciencias, Universidad Autonoma de Madrid, c/ Francisco Tomas y Valiente 7, Campus de Cantoblanco, E-28049-Madrid (Spain); Paredes, R [Centro de Fisica, Instituto Venezolano de Investigaciones CientIficas, Apartado 20632, Caracas 1020A (Venezuela); Pelaez, S; Serena, P A [Instituto de Ciencia de Materiales de Madrid, Consejo Superior de Investigaciones CientIficas, c/ Sor Juana Ines de la Cruz 3, Campus de Cantoblanco, E-28049-Madrid (Spain)], E-mail: pedro.garciamochales@uam.es

2008-06-04

We have performed a massive statistical analysis of the breaking behaviour of Ni nanowires using molecular dynamics simulations. Three stretching directions, five initial nanowire sizes and two temperatures have been studied. We have constructed minimum cross-section histograms and analysed for the first time the role played by monomers and dimers. The shape of such histograms and the absolute number of monomers and dimers strongly depend on the stretching direction and the initial size of the nanowire. In particular, the statistical behaviour of the final breakage stages of narrow nanowires strongly differs from the behaviour obtained for large nanowires. We have analysed the structure around monomers and dimers. Their most probable local configurations differ from those usually appearing in static electron transport calculations. Their non-local environments show disordered regions along the nanowire if the stretching direction is [100] or [110]. Additionally, we have found that, at room temperature, the [100] and [110] stretching directions favour the appearance of non-crystalline staggered pentagonal structures. These pentagonal Ni nanowires are reported in this work for the first time. This set of results suggests that experimental Ni conducting histograms could show a strong dependence on orientation and temperature.

14. Statistical process control of a Kalman filter model.

Science.gov (United States)

Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A

2014-09-26

For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. Besides standard tests, the statistical tests include the convergence of the standard deviations of the system state components and the normal distribution of residuals. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations.

15. Preliminary evaluation of alternative waste form solidification processes. Volume II. Evaluation of the processes

Energy Technology Data Exchange (ETDEWEB)

1980-08-01

This Volume II presents engineering feasibility evaluations of the eleven processes for solidification of nuclear high-level liquid wastes (HLLW) described in Volume I of this report. Each evaluation was based on a systematic assessment of the process with respect to six principal evaluation criteria: complexity of process; state of development; safety; process requirements; development work required; and facility requirements. The principal criteria were further subdivided into a total of 22 subcriteria, each of which was assigned a weight. Each process was then assigned a figure of merit, on a scale of 1 to 10, for each of the subcriteria. A total rating was obtained for each process by summing the products of the subcriteria ratings and the subcriteria weights. The evaluations were based on the process descriptions presented in Volume I of this report, supplemented by information obtained from the literature, including publications by the originators of the various processes. Waste form properties were, in general, not evaluated. This document describes the approach which was taken, the development and application of the rating criteria and subcriteria, and the evaluation results. A series of appendices sets forth summary descriptions of the processes and the ratings, together with the complete numerical ratings assigned; two appendices present further technical details on the rating process.
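
The total rating described above is a simple weighted sum. A minimal sketch follows; the criteria names, weights, ratings, and process names here are hypothetical stand-ins, not the report's actual 22 subcriteria:

```python
# Hypothetical subcriteria weights and per-process ratings on a 1-10 scale,
# illustrating the weighted figure-of-merit used in the evaluation.
weights = {"complexity": 0.25, "development": 0.20, "safety": 0.30, "facility": 0.25}
ratings = {
    "borosilicate glass": {"complexity": 8, "development": 9, "safety": 7, "facility": 8},
    "supercalcine":       {"complexity": 5, "development": 4, "safety": 8, "facility": 6},
}

# total rating = sum over subcriteria of (rating * weight)
totals = {proc: sum(weights[c] * r[c] for c in weights) for proc, r in ratings.items()}
best = max(totals, key=totals.get)
print(best, round(totals[best], 2))
```

With weights normalized to sum to 1, the total stays on the same 1-10 scale as the individual figures of merit, which makes processes directly comparable.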

16. Hydrocarbonization process evaluation report. Volume II. Evaluation of process feasibility. [49 refs]

Energy Technology Data Exchange (ETDEWEB)

Holmes, J.M.; Dyslin, D.A.; Edwards, M.S.; Joy, D.S.; Peterson, G.R.

1977-07-01

Volume II of a two-volume study concerning the preliminary design and economic evaluation of a Hydrocarbonization Facility includes: (1) a review of the current status of the major processing units, (2) an assessment of operating problems, (3) considerations of possible process alternatives, (4) an evaluation of the overall process feasibility, and (5) recommendations for future process development. Results of the study emphasize the need for testing the evaluated process, which is based on the Clean Coke Process, in a continuous pilot plant using a wide variety of highly caking bituminous coals as feed material. A program suggested for the pilot plant would encompass: (1) development of improved methods for the prevention of agglomeration of highly caking coals during hydrocarbonization, (2) optimization of the yields of coal liquids, (3) investigation of a single-stage high-temperature hydrocarbonizer optimized for char production, and (4) optimization of beneficiation ratios employed during coal preparation.

17. Process simulator for wind turbine control. Volume 2. System analyses; Processimulator voor windturbinebesturingen. Volume 2. Systeemanalyses

Energy Technology Data Exchange (ETDEWEB)

Van der Hooft, E.L.; Verbruggen, T.W.; Van Engelen, T.G. [ECN Wind Energy, Petten (Netherlands)

2002-10-01

Because of upscaling and less accessible offshore sites, control systems of wind turbines must be tested in advance to fulfil reliability requirements. By means of process simulations, the operation and performance of the control system, and how it deals with failures of components and subsystems and with extreme operating conditions, can be assessed. In a previous report (Volume 1), attention was paid to the development of the planned real-time process simulation tool WindConTest (Wind Turbine Control Systems Test, Evaluation and Simulation Tool). In this volume, results of system analyses for wind turbines with constant speed and variable speed are presented. Based on those results, process models and programs can be derived in order to realize a process simulation tool in phase 2 of the project. [Dutch] Owing to upscaling and less accessible offshore locations, advance testing of wind turbine control systems is increasingly important in order to meet high reliability requirements. Process simulations make it possible to assess whether the control system handles the failure of components and (sub)systems properly and whether extreme operating conditions are withstood well. The project, whose end goal is the real-time process simulation tool for wind turbine control systems, WINDCONTEST, consists of two phases. The work in phase I covers problem analysis and system analysis; in the planned phase II it covers modelling and implementation. System analyses have been carried out for wind turbines with constant and with variable rotational speed. The analyses give substance to the inventory and definition tasks. The 'system analyses' report describes the definition results following the approach established in the first report (problem analysis). The inventory results largely contain specific wind turbine data and have therefore been included in confidential annexes, which have been issued separately. Based on the analysis results, in

18. Nonlinear Statistical Process Monitoring and Fault Detection Using Kernel ICA

Institute of Scientific and Technical Information of China (English)

ZHANG Xi; YAN Wei-wu; ZHAO Xu; SHAO Hui-he

2007-01-01

A novel nonlinear process monitoring and fault detection method based on kernel independent component analysis (ICA) is proposed. The kernel ICA method is a two-phase algorithm: whitened kernel principal component analysis (KPCA) plus ICA. KPCA spheres the data and makes the data structure become as linearly separable as possible by virtue of an implicit nonlinear mapping determined by the kernel. ICA seeks the projection directions in the KPCA-whitened space, making the distribution of the projected data as non-Gaussian as possible. The application to the fluid catalytic cracking unit (FCCU) simulated process indicates that the proposed process monitoring method based on kernel ICA can effectively capture the nonlinear relationships in process variables. Its performance significantly outperforms monitoring methods based on ICA or KPCA alone.
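
The two-phase idea (KPCA whitening followed by ICA) can be roughly sketched with scikit-learn building blocks; the RBF kernel, its parameters, and the toy data below are assumptions for illustration, not the paper's FCCU configuration:

```python
import numpy as np
from sklearn.decomposition import KernelPCA, FastICA

rng = np.random.default_rng(0)
# two nonlinearly mixed latent sources, standing in for process variables
s = rng.uniform(-1, 1, (500, 2))
x = np.column_stack([s[:, 0] + s[:, 1]**3, np.tanh(s[:, 0]) - s[:, 1]])

# phase 1: KPCA maps the data into a space where its structure is closer to linear
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=1.0)
z = kpca.fit_transform(x)
z /= z.std(axis=0)                     # whiten the KPCA scores

# phase 2: ICA seeks maximally non-Gaussian projections of the whitened scores;
# these projections are what the monitoring statistics would be built on
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
components = ica.fit_transform(z)
print(components.shape)
```

In an actual monitoring scheme, control limits (e.g., on an I² statistic and the squared prediction error) would then be placed on these components.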

19. Harnessing graphics processing units for improved neuroimaging statistics.

Science.gov (United States)

Eklund, Anders; Villani, Mattias; Laconte, Stephen M

2013-09-01

Simple models and algorithms based on restrictive assumptions are often used in the field of neuroimaging for studies involving functional magnetic resonance imaging, voxel-based morphometry, and diffusion tensor imaging. Nonparametric statistical methods or flexible Bayesian models can be applied rather easily to yield more trustworthy results. The spatial normalization step required for multisubject studies can also be improved by taking advantage of more robust algorithms for image registration. A common drawback of algorithms based on weaker assumptions, however, is the increase in computational complexity. In this short overview, we therefore present some examples of how inexpensive PC graphics hardware, normally used for demanding computer games, can be used to enable practical use of more realistic models and accurate algorithms, such that the outcome of neuroimaging studies really can be trusted.

20. Breaking processes in nickel nanocontacts: a statistical description

Energy Technology Data Exchange (ETDEWEB)

Garcia-Mochales, P.; Pelaez, S.; Serena, P.A. [Consejo Superior de Investigaciones Cientificas, Instituto de Ciencia de Materiales de Madrid, Madrid (Spain); Medina, E.; Hasmy, A. [Instituto Venezolano de Investigaciones Cientificas, Centro de Fisica, Apdo. 21827, Caracas (Venezuela)

2005-12-01

In this work we perform a statistical study of favorable atomic configurations of nickel nanocontacts during their stretching at 4 K and 300 K. Nanowire breaking events are simulated using molecular dynamics (MD), where atomic interactions are represented with state-of-the-art embedded atom method (EAM) interatomic potentials. The full determination of atomic positions during the contact evolution allows determination of the evolution of the minimum cross-section S{sub m} during stretching. By accumulating many breaking traces, we built minimum cross-section histograms H(S{sub m}). These simulated histograms reveal the presence of preferential geometrical arrangements during the nanocontact breaking, showing that no remarkable differences should appear between the low-temperature (4 K) and room-temperature (300 K) situations. These results show that the differences observed between low- and room-temperature experimental Ni conductance histograms are not caused by different structural evolution and that, therefore, other phenomena are involved. (orig.)

1. Statistical issues in the comparison of quantitative imaging biomarker algorithms using pulmonary nodule volume as an example.

Science.gov (United States)

Obuchowski, Nancy A; Barnhart, Huiman X; Buckler, Andrew J; Pennello, Gene; Wang, Xiao-Feng; Kalpathy-Cramer, Jayashree; Kim, Hyun J Grace; Reeves, Anthony P

2015-02-01

Quantitative imaging biomarkers are being used increasingly in medicine to diagnose and monitor patients' disease. The computer algorithms that measure quantitative imaging biomarkers have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms' bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms' performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for quantitative imaging biomarker studies.
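
Repeatability from paired replicate measurements, as in the first phantom study, is commonly summarized by the within-subject standard deviation (wSD) and the repeatability coefficient RC = 2.77 wSD. A sketch on hypothetical phantom volumes (not the studies' actual data):

```python
import numpy as np

# Hypothetical phantom data: each row is one nodule measured twice
# by the same algorithm under identical conditions (volumes in mm^3).
pairs = np.array([
    [512.0, 498.0], [1040.0, 1065.0], [250.0, 244.0],
    [780.0, 801.0], [330.0, 322.0],
])

diff = pairs[:, 0] - pairs[:, 1]
wsd = np.sqrt(np.mean(diff**2) / 2.0)   # within-subject SD from paired replicates
rc = 2.77 * wsd                          # ~95% of replicate differences fall within +/- RC
print(f"wSD = {wsd:.1f} mm^3, RC = {rc:.1f} mm^3")
```

The factor 2.77 is 1.96·√2: it converts the within-subject SD into a 95% bound on the absolute difference between two replicate measurements.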

2. Full current statistics for a disordered open exclusion process

Science.gov (United States)

Ayyer, Arvind

2016-04-01

We consider the nonabelian sandpile model defined on directed trees by Ayyer et al (2015 Commun. Math. Phys. 335 1065) and restrict it to the special case of a one-dimensional lattice of n sites which has open boundaries and disordered hopping rates. We focus on the joint distribution of the integrated currents across each bond simultaneously, and calculate its cumulant generating function exactly. Surprisingly, the process conditioned on seeing specified currents across each bond turns out to be a renormalised version of the same process. We also remark on a duality property of the large deviation function. Lastly, all eigenvalues and both Perron eigenvectors of the tilted generator are determined.

3. Statistical Considerations of Data Processing in Giovanni Online Tool

Science.gov (United States)

Suhung, Shen; Leptoukh, G.; Acker, J.; Berrick, S.

2005-01-01

The GES DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni) is a web-based interface for the rapid visualization and analysis of gridded data from a number of remote sensing instruments. The GES DISC currently employs several Giovanni instances to analyze various products, such as Ocean-Giovanni for ocean products from SeaWiFS and MODIS-Aqua; TOMS & OMI Giovanni for atmospheric chemical trace gases from TOMS and OMI; and MOVAS for aerosols from MODIS, etc. (http://giovanni.gsfc.nasa.gov) Foremost among the Giovanni statistical functions is data averaging. Two aspects of this function are addressed here. The first deals with the accuracy of averaging gridded mapped products vs. averaging from the ungridded Level 2 data. Some mapped products contain mean values only; others contain additional statistics, such as the number of pixels (NP) for each grid cell, standard deviation, etc. Since NP varies spatially and temporally, averaging with or without weighting by NP will give different results. In this paper, we address differences between various weighting algorithms for some datasets utilized in Giovanni. The second aspect is related to different averaging methods affecting data quality and interpretation for data with a non-normal distribution. The present study demonstrates results of different spatial averaging methods using gridded SeaWiFS Level 3 mapped monthly chlorophyll a data. Spatial averages were calculated using three different methods: arithmetic mean (AVG), geometric mean (GEO), and maximum likelihood estimator (MLE). Biogeochemical data, such as chlorophyll a, are usually considered to have a log-normal distribution. The study determined that differences between methods tend to increase with increasing size of a selected coastal area, with no significant differences in most open oceans. The GEO method consistently produces values lower than AVG and MLE. The AVG method produces values larger than MLE in some cases, but smaller in other cases. Further
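
The difference between the three averaging methods is easy to demonstrate on synthetic log-normal data (a sketch only; the actual Giovanni processing operates on gridded SeaWiFS fields):

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic chlorophyll-a values (mg m^-3), log-normally distributed,
# as is typical for biogeochemical data
chl = rng.lognormal(mean=-1.0, sigma=1.0, size=10_000)

avg = chl.mean()                            # arithmetic mean (AVG)
geo = np.exp(np.log(chl).mean())            # geometric mean (GEO)
mu, s2 = np.log(chl).mean(), np.log(chl).var()
mle = np.exp(mu + s2 / 2.0)                 # MLE of the mean under a lognormal model
print(f"AVG={avg:.3f}  GEO={geo:.3f}  MLE={mle:.3f}")
```

For log-normal data GEO is always below AVG (the AM-GM inequality), and the gap grows with the variance of log(chl), which is why the method choice matters most in heterogeneous coastal regions.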

4. Processing and statistical analysis of soil-root images

Science.gov (United States)

Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov

2016-04-01

The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). Following further development of soil zymography - to obtain a higher resolution of enzyme activities - we aimed to 1) quantify the images, and 2) determine whether the pattern (e.g. the distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to determine whether the distribution of hotspots in space is clumped (aggregated) or regular (dispersed). Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), compared to the control box without a plant, which showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such different patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
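
One standard way to classify a point pattern as clumped or dispersed is the Clark-Evans nearest-neighbour index. A sketch on synthetic hotspot coordinates, ignoring edge corrections (the paper's own point pattern analysis may use different statistics):

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans(points, area):
    """Clark-Evans index R: R < 1 clumped, R ~ 1 random, R > 1 dispersed."""
    d, _ = cKDTree(points).query(points, k=2)   # k=2: nearest neighbour other than self
    r_obs = d[:, 1].mean()
    r_exp = 0.5 / np.sqrt(len(points) / area)   # expected NN distance under randomness
    return r_obs / r_exp

rng = np.random.default_rng(0)
# hotspots aggregated around one location (clumped, rhizosphere-like)
clumped = rng.normal(0.5, 0.05, (200, 2))
# hotspots on an even grid (dispersed, control-like)
regular = np.stack(np.meshgrid(np.linspace(0.05, 0.95, 14),
                               np.linspace(0.05, 0.95, 14)), -1).reshape(-1, 2)
print(clark_evans(clumped, 1.0), clark_evans(regular, 1.0))
```

The aggregated cloud yields R well below 1, the grid yields R above 1, matching the clumped-vs-regular distinction drawn in the abstract.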

5. Counting statistics of non-markovian quantum stochastic processes

DEFF Research Database (Denmark)

Flindt, Christian; Novotny, T.; Braggio, A.

2008-01-01

We derive a general expression for the cumulant generating function (CGF) of non-Markovian quantum stochastic transport processes. The long-time limit of the CGF is determined by a single dominating pole of the resolvent of the memory kernel from which we extract the zero-frequency cumulants of t...

6. GASP cloud- and particle-encounter statistics and their application to LFC aircraft studies. Volume 2: Appendixes

Science.gov (United States)

Jasperson, W. H.; Nastron, G. D.; Davis, R. E.; Holdeman, J. D.

1984-01-01

Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle-concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud-encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical. This report is presented in two volumes. Volume I contains the narrative, analysis, and conclusions. Volume II contains five supporting appendixes.

7. High Statistics Analysis using Anisotropic Clover Lattices: (IV) The Volume Dependence of the Light Hadron Masses

Energy Technology Data Exchange (ETDEWEB)

Beane, S R; Detmold, W; Lin, H W; Luu, T C; Orginos, K; Parreno, A; Savage, M J; Torok, A; Walker-Loud, A

2011-07-01

The volume dependence of the octet baryon masses and relations among them are explored with Lattice QCD. Calculations are performed with an n_f = 2 + 1 clover fermion discretization in four lattice volumes, with spatial extent L ≈ 2.0, 2.5, 3.0 and 4.0 fm, with an anisotropic lattice spacing of b_s ≈ 0.123 fm in the spatial direction and b_t = b_s/3.5 in the time direction, and at a pion mass of m_π ≈ 390 MeV. The typical precision of the ground-state baryon mass determination is sufficient for a precise exploration of the volume dependence of the masses, the Gell-Mann-Okubo mass relation, and of other mass combinations. A comparison with the predictions of heavy baryon chiral perturbation theory is performed in both the SU(2)_L ⊗ SU(2)_R and SU(3)_L ⊗ SU(3)_R expansions. Predictions of the three-flavor expansion for the hadron masses are found to describe the observed volume dependences reasonably well. Further, the πNΔ axial coupling constant is extracted from the volume dependence of the nucleon mass in the two-flavor expansion, with only small modifications in the three-flavor expansion from the inclusion of kaons and etas. At a given value of m_πL, the finite-volume contributions to the nucleon mass are predicted to be significantly smaller at m_π ≈ 140 MeV than at m_π ≈ 390 MeV due to a coefficient that scales as m_π^3. This is relevant for the design of future ensembles of lattice gauge-field configurations. Finally, the volume dependence of the pion and kaon masses are analyzed with two-flavor and three-flavor chiral perturbation theory.

8. Use of statistical process control in evaluation of academic performance

Directory of Open Access Journals (Sweden)

Ezequiel Gibbon Gautério

2014-05-01

Full Text Available The aim of this article was to study some indicators of academic performance (number of students per class, dropout rate, failure rate and scores obtained by the students) to identify a pattern of behavior that would enable improvements to be implemented in the teaching-learning process. The sample was composed of five classes of undergraduate courses in Engineering. The data were collected over three years. Initially, an exploratory analysis with analytical and graphical techniques was performed. An analysis of variance and Tukey's test investigated some sources of variability. This information was used in the construction of control charts. We have found evidence that classes with more students are associated with higher failure rates and lower mean scores. Moreover, when the course was later in the curriculum, the students had higher scores. The results showed that, although some special causes interfering with the process were detected, it was possible to stabilize the process and to monitor it.
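
A control chart of the kind used for such failure-rate monitoring can be sketched as a p-chart; the rates and class size below are hypothetical, not the article's data:

```python
import numpy as np

# Hypothetical per-semester failure rates for one course (proportion of students)
rates = np.array([0.22, 0.25, 0.19, 0.28, 0.24, 0.21, 0.50, 0.23, 0.26, 0.20])
n = 40                                       # assumed class size

p_bar = rates.mean()                         # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / n)     # binomial standard error
ucl = p_bar + 3 * sigma                      # upper control limit
lcl = max(p_bar - 3 * sigma, 0.0)            # lower control limit (floored at 0)

# points outside the 3-sigma limits signal special causes of variation
out = [i for i, p in enumerate(rates) if p > ucl or p < lcl]
print(f"UCL={ucl:.3f} LCL={lcl:.3f} out-of-control semesters: {out}")
```

A flagged semester would then be investigated for a special cause (e.g., a curriculum change) before the limits are recomputed on the stabilized process.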

9. Multivariate Statistical Process Optimization in the Industrial Production of Enzymes

DEFF Research Database (Denmark)

Klimkiewicz, Anna

ultrafiltration operation is limited by the membrane fouling phenomena, where the production capacity - monitored as flow through the membrane, or flux - decreases over time. The flux varies considerably from run to run within the same product, and likewise between different products. This variability clearly affects......, the study revealed that the less demanding in-line flow cell setup outperformed the on-line arrangement. The former proved robust towards different products (amylases and proteases) and associated processing parameters such as temperature and processing speed. This dissertation work shows......In modern biotech production, a massive number of diverse measurements, with a broad diversity in information content and quality, is stored in data historians. The potential of this enormous amount of data is currently under-employed in process optimization efforts. This is a result...

10. A statistical approach to define some tofu processing conditions

Directory of Open Access Journals (Sweden)

Vera de Toledo Benassi

2011-12-01

Full Text Available The aim of this work was to make tofu from soybean cultivar BRS 267 under different processing conditions in order to evaluate the influence of each treatment on product quality. A fractional factorial 2^(5-1) design was used, in which the independent variables (thermal treatment, coagulant concentration, coagulation time, curd cutting, and draining time) were tested at two different levels. The response variables studied were hardness, yield, total solids, and protein content of the tofu. Polynomial models were generated for each response. To obtain tofu with desirable characteristics (hardness ~4 N, yield 306 g tofu/100 g soybeans, 12 g protein/100 g tofu and 22 g solids/100 g tofu), the following processing conditions were selected: heating until boiling plus 10 minutes in a water bath, 2% w/w dihydrated CaSO4, 10 minutes of coagulation, curd cutting, and 30 minutes of draining time.
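A 2^(5-1) fractional factorial design halves the 32 runs of the full 2^5 factorial by confounding the fifth factor with the four-way interaction. A minimal sketch of generating such a design (the generator E = ABCD is the standard choice; the article does not state which generator it used):

```python
# Sketch: generating a 2^(5-1) fractional factorial design.
# Four factors form a full 2^4 design; the fifth level is set by the
# defining relation E = ABCD (so I = ABCDE). The standard generator is
# assumed here; the article does not specify its defining relation.
from itertools import product

factors = ["thermal", "coagulant", "coag_time", "curd_cut", "drain_time"]

runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d          # generator E = ABCD
    runs.append((a, b, c, d, e))

print(len(runs))               # 16 runs instead of 32 for the full factorial
```

Every run satisfies a*b*c*d*e = +1, which is exactly the defining relation I = ABCDE; main effects are then aliased only with four-factor interactions.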

11. Multivariate Statistical Process Optimization in the Industrial Production of Enzymes

DEFF Research Database (Denmark)

Klimkiewicz, Anna

In modern biotech production, a massive number of diverse measurements, with a broad diversity in information content and quality, is stored in data historians. The potential of this enormous amount of data is currently under-employed in process optimization efforts. This is a result of the deman......In modern biotech production, a massive number of diverse measurements, with a broad diversity in information content and quality, is stored in data historians. The potential of this enormous amount of data is currently under-employed in process optimization efforts. This is a result...... and difficulties related to 'recycling' of historical data from full-scale manufacturing of industrial enzymes. First, the crucial and tedious step of retrieving the data from the systems is presented. The prerequisites that need to be comprehended are discussed, such as sensor accuracy and reliability, and aspects...... related to the actual measuring frequency and non-equidistant retention strategies in data storage. Different regimes of data extraction can be employed, and some might introduce undesirable artifacts in the final analysis results (POSTER II1). Several signal processing techniques are also briefly...

12. Statistical process control (SPC) for coordinate measurement machines

Energy Technology Data Exchange (ETDEWEB)

Escher, R.N.

2000-01-04

The application of process capability analysis, using designed experiments, and gage capability studies as they apply to coordinate measurement machine (CMM) uncertainty analysis and control will be demonstrated. The use of control standards in designed experiments, and the use of range charts and moving range charts to separate measurement error into its discrete components, will be discussed. The method used to monitor and analyze the components of repeatability and reproducibility will be presented, with specific emphasis on how to use control charts to determine and monitor CMM performance and capability while staying within stated uncertainty assumptions.

13. Intertime jump statistics of state-dependent Poisson processes.

Science.gov (United States)

Daly, Edoardo; Porporato, Amilcare

2007-01-01

A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. Such a method uses the survivor function obtained by a modified version of the master equation associated to the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. The application of the method to a model for neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.
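The jump times of a state-dependent Poisson process can be simulated by thinning (accept/reject against a bounding constant rate), after which the interarrival times whose distribution the paper characterizes can be collected. A minimal sketch with an invented intensity function, not one from the paper:

```python
# Sketch: simulating jump times of a state-dependent Poisson process by
# thinning, then collecting interarrival times. The intensity lam(x) is
# illustrative: jumps become rarer as the state (jump count) grows.
import random

def lam(x):
    """Illustrative state-dependent intensity, bounded above by 2.0."""
    return 2.0 / (1.0 + x)

def simulate(t_end, seed=1):
    rng = random.Random(seed)
    lam_max = 2.0                 # upper bound on lam(x), required by thinning
    t, x = 0.0, 0.0
    jump_times = []
    while True:
        t += rng.expovariate(lam_max)          # candidate event time
        if t > t_end:
            break
        if rng.random() < lam(x) / lam_max:    # accept with prob lam(x)/lam_max
            jump_times.append(t)
            x += 1.0                           # state: number of jumps so far
    return jump_times

jumps = simulate(100.0)
inter = [b - a for a, b in zip(jumps, jumps[1:])]
# Because lam(x) decays with the state, later interarrivals are longer
# on average; the empirical distribution of `inter` is what a survivor
# function of the kind derived in the paper would describe.
```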

15. Nonlinear Statistical Signal Processing: A Particle Filtering Approach

Energy Technology Data Exchange (ETDEWEB)

Candy, J

2007-09-19

An introduction to particle filtering is given, starting with an overview of Bayesian inference from batch to sequential processors. Once the evolving Bayesian paradigm is established, simulation-based methods using sampling theory and Monte Carlo realizations are discussed. Here the usual limitations of nonlinear approximations and non-Gaussian processes prevalent in classical nonlinear processing algorithms (e.g. Kalman filters) are no longer a restriction on performing Bayesian inference. It is shown how the underlying hidden or state variables are easily assimilated into this Bayesian construct. Importance sampling methods are then discussed, and it is shown how they can be extended to sequential solutions implemented using Markovian state-space models as a natural evolution. With this in mind, the idea of a particle filter, which is a discrete representation of a probability distribution, is developed, and it is shown how it can be implemented using sequential importance sampling/resampling methods. Finally, an application is briefly discussed comparing the performance of particle filter designs with classical nonlinear filter implementations.
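The sequential importance sampling/resampling idea can be sketched as a bootstrap particle filter. A linear-Gaussian model is used below only so the behavior is easy to check; the model, noise levels, and particle count are illustrative choices, while the propagate/weight/resample loop is the generic algorithm the report describes.

```python
# Sketch of a bootstrap particle filter (sequential importance
# sampling/resampling) for a scalar state-space model:
#   x_t = 0.9 x_{t-1} + w_t,   y_t = x_t + v_t,   w, v ~ N(0, 0.5^2)
import math
import random

rng = random.Random(7)

def normal_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def particle_filter(ys, n=500):
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in ys:
        # 1. Propagate through the dynamics (sampling from the prior:
        #    the "bootstrap" importance distribution).
        particles = [0.9 * p + rng.gauss(0.0, 0.5) for p in particles]
        # 2. Weight each particle by the likelihood of the observation.
        weights = [normal_pdf(y, p, 0.5) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # 3. Estimate the state, then resample to avoid weight degeneracy.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

# Simulate data and run the filter.
x, ys = 0.0, []
for _ in range(50):
    x = 0.9 * x + rng.gauss(0.0, 0.5)
    ys.append(x + rng.gauss(0.0, 0.5))
est = particle_filter(ys)
```

The resampling step is what distinguishes this from plain sequential importance sampling: without it, after a few steps nearly all weight concentrates on one particle.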

16. STUDY OF SEASONAL TREND-PROCESS WITH THE METHOD OF CLASSICAL STATISTICS

Directory of Open Access Journals (Sweden)

Kymratova A. M.

2014-11-01

Full Text Available This work is devoted to methods from multicriteria optimization and classical statistics for obtaining preliminary estimates for time series that have long-term memory. Because of this memory, their levels do not satisfy the independence property, and the classical prediction methods may therefore be inadequate. The developed methods for obtaining such information are based on classical statistical methods such as mathematical statistics, multicriteria optimization and extreme value theory. The effectiveness of the proposed approach is demonstrated on specific time series of the volumes of mountain rivers.

17. An introduction to stochastic processes and nonequilibrium statistical physics

CERN Document Server

Wio, Horacio S; Lopez, Juan M

2012-01-01

This book aims to provide a compact and unified introduction to the most important aspects in the physics of non-equilibrium systems. It first introduces stochastic processes and some modern tools and concepts that have proved their usefulness to deal with non-equilibrium systems from a purely probabilistic angle. The aim is to show the important role played by fluctuations in far-from-equilibrium situations, where noise can promote order and organization, switching among non-equilibrium states, etc. The second part adopts a more historical perspective, retracing the first steps taken from the purely thermodynamic as well as from the kinetic points of view to depart (albeit slightly) from equilibrium. The third part revisits the path outlined in the first one, but now undertakes the mesoscopic description of extended systems, where new phenomena (patterns, long-range correlations, scaling far from equilibrium, etc.) are observed.

18. Child Development and the Housing Environment. Volume 1: Statistical Design and Analysis.

Science.gov (United States)

Urban Systems Research and Engineering, Inc., Cambridge, MA.

The first part of a three-volume study, this report presents and justifies a research design for investigation of the relationship between the housing environment and the range of child development and family measures. The recommended design is nonexperimental in nature and focuses on comparisons of residents in selected housing programs (publicly…

19. Statistical Comparisons of Pre and Post Del Mod Data, Final Report, Volume V.

Science.gov (United States)

Bolig, John R.; Wall, Charles A.

This is one of five volumes prepared to describe various aspects of the Del Mod System. This report attempts to compare baseline data gathered in 1970-71 to post experimental data gathered in 1975-76. Sections include analyses of achievement test scores, research conducted under the auspices of the Del Mod System, and a follow-up study of…

20. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

Science.gov (United States)

Davis, A. D.

2015-12-01

The marine ice sheet instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI): here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost. Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity
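The abstract's online MCMC method is specialized, but its generic building block is a Metropolis-type sampler over the posterior. A minimal random-walk Metropolis sketch with a simple Gaussian stand-in for the posterior (the real application targets a posterior over basal friction and topography; the target, step size, and sample counts here are invented for illustration):

```python
# Minimal random-walk Metropolis sampler, the generic building block
# behind MCMC posterior characterization. The target below is a
# stand-in log-posterior, N(mean = 2.0, sd = 1.0).
import math
import random

rng = random.Random(0)

def log_target(theta):
    return -0.5 * (theta - 2.0) ** 2   # stand-in log-posterior (unnormalized)

def metropolis(n_samples, step=1.0, theta0=0.0):
    theta = theta0
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal)/target(theta)).
        log_ratio = log_target(proposal) - log_target(theta)
        if rng.random() < math.exp(min(0.0, log_ratio)):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(20000)
posterior_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
```

Each posterior sample would, in the paper's setting, be pushed through the predictive model to build the distribution of future grounded ice volume; the expense of that push-forward is exactly what motivates their online approximation.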

1. Calcul statistique du volume des blocs matriciels d'un gisement fissuré The Statistical Computing of Matrix Block Volume in a Fissured Reservoir

Directory of Open Access Journals (Sweden)

Guez F.

2006-11-01

the distribution of block volumes. But it is precisely this distribution that governs the choice of one or several successive recovery methods. Therefore, this article describes an original method for statistically computing the distribution law of matrix-block volumes. This method can be applied at any point in a reservoir. The portion of the reservoir involved with blocks having a given volume is deduced from this method. A general understanding of the fracturing phenomenon serves as the basis for the model. Subsurface observations of reservoir fracturing provide the data (histograms of fracture directions and spacings). An application to the Eschau field (Alsace, France) is described to illustrate the method.

2. How log-normal is your country? An analysis of the statistical distribution of the exported volumes of products

Science.gov (United States)

Annunziata, Mario Alberto; Petri, Alberto; Pontuale, Giorgio; Zaccaria, Andrea

2016-10-01

We have considered the statistical distributions of the volumes of 1131 products exported by 148 countries. We have found that the form of these distributions is not unique but depends heavily on the level of development of the nation, as expressed by macroeconomic indicators like GDP, GDP per capita, total export, and a recently introduced measure of countries' economic complexity called fitness. We have identified three major classes: a) an incomplete log-normal shape, truncated on the left side, for the less developed countries; b) a complete log-normal, with a wider range of volumes, for nations with intermediate economies; and c) a strongly asymmetric shape for countries with a high degree of development. Finally, the log-normality hypothesis has been checked for the distributions of all 148 countries through different tests (Kolmogorov-Smirnov and Cramér-von Mises), confirming that it cannot be rejected only for the countries with intermediate economies.
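A log-normality check of the Kolmogorov-Smirnov type can be sketched with the standard library: take logs, fit a normal by maximum likelihood, and compute the maximum gap between the empirical and fitted CDFs. The data below are synthetic (the real export volumes are not reproduced here), and note the caveat in the comments about fitted parameters.

```python
# Sketch: Kolmogorov-Smirnov check of log-normality. When the normal
# parameters are estimated from the same data, the standard KS critical
# values are only approximate; a Lilliefors correction or parametric
# bootstrap is then needed for exact p-values.
import math
import random
from statistics import NormalDist, mean, pstdev

rng = random.Random(42)

# Synthetic "export volumes", log-normal by construction.
volumes = [math.exp(rng.gauss(10.0, 2.0)) for _ in range(2000)]

logs = sorted(math.log(v) for v in volumes)
fitted = NormalDist(mean(logs), pstdev(logs))   # MLE fit on the logs

# KS statistic: largest gap between empirical and fitted CDFs.
n = len(logs)
d = max(max(abs((i + 1) / n - fitted.cdf(x)),
            abs(i / n - fitted.cdf(x)))
        for i, x in enumerate(logs))
# d is small for genuinely log-normal data and grows for the truncated
# or asymmetric shapes the paper reports for other country classes.
```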

3. Fast statistical delay evaluation of RC interconnect in the presence of process variations

Energy Technology Data Exchange (ETDEWEB)

Li Jianwei; Dong Gang; Yang Yintang; Wang Zeng, E-mail: lijianwei_zz@sina.co [Key Laboratory of Ministry of Education for Wide Band-Gap Semiconductor Materials and Devices, Microelectronics Institute, Xidian University, Xi' an 710071 (China)

2010-04-15

Fast statistical methods for evaluating interconnect delay and slew in the presence of process fluctuations are proposed. Using an optimized quadratic model to describe the effects of process variations, the proposed method enables closed-form expressions for interconnect delay and slew given the variations in the relevant process parameters. Simulation results show that the method, which has statistical characteristics similar to the traditional methodology, is more efficient than HSPICE-based Monte Carlo simulations and the traditional methodology.
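The appeal of a quadratic variation model is that delay moments follow in closed form. A one-parameter sketch of the moment algebra (the paper's model is multi-parameter and optimized; the coefficients below are invented): for d = d0 + a·x + b·x² with x ~ N(0, σ²), one has E[d] = d0 + bσ² and Var[d] = a²σ² + 2b²σ⁴, which the code checks against Monte Carlo.

```python
# Sketch: closed-form mean/variance of a quadratic delay model in one
# zero-mean Gaussian process parameter, checked against Monte Carlo.
#   E[d]   = d0 + b*sigma^2
#   Var[d] = a^2*sigma^2 + 2*b^2*sigma^4   (uses E[x^4] = 3*sigma^4)
import random

d0, a, b, sigma = 100.0, 5.0, 0.8, 1.5    # illustrative coefficients

mean_cf = d0 + b * sigma ** 2
var_cf = a ** 2 * sigma ** 2 + 2 * b ** 2 * sigma ** 4

rng = random.Random(3)
samples = [d0 + a * x + b * x * x
           for x in (rng.gauss(0.0, sigma) for _ in range(200000))]
mean_mc = sum(samples) / len(samples)
var_mc = sum((s - mean_mc) ** 2 for s in samples) / len(samples)
```

This is the sense in which a quadratic model yields "closed-form expressions" that avoid per-instance Monte Carlo runs.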

4. Frothing in flotation. Volume 2: Recent advances in coal processing

Energy Technology Data Exchange (ETDEWEB)

Laskowski, J.S. [ed.] [Univ. of British Columbia, Vancouver, British Columbia (Canada); Woodburn, E.T. [ed.] [Univ. of Manchester Inst. of Science and Technology (United Kingdom)

1998-11-01

This volume summarizes achievements on various aspects of flotation froth properties and behavior, and the relationship between froth appearance and flotation performance. Flotation kinetics involves a number of mass transfer processes, some of which are critically determined by the behavior of the froth. Since froth is complex, and controlled experimentation is difficult, the froth phase was, until recently, either ignored or treated entirely empirically. With the wide application of flotation columns, the behavior of the froth is now often recognized as dominant in determining flotation performance, and research in this area is among the most actively pursued. Contents include: Frothers and frothing; Effect of particle and bubble size on flotation kinetics; Water content and distribution in flotation froths; Mechanisms operating in flotation froths; Characterization of flotation froth; Simultaneous determination of collection zone rate constant and froth zone recovery factor; Modelling of froth dynamics with implications for feed-back control; The interrelationship between flotation variables and froth appearance; Froth image analysis in a flotation control system; Kinetic flotation modelling using froth imaging data; and Dependence of froth behavior on galvanic interactions.

5. Measurement of Work Processes Using Statistical Process Control: Instructor’s Manual

Science.gov (United States)

1987-03-01

Iron Age, 22(29), 59, 61, 63. Box, G. E. P., & Draper, N. R. (1969). Evolutionary operation. New York: John Wiley. Box, G. E. P., Hunter, W. G., & Hunter, J. S. (1978). Statistics for experimenters. New York: John Wiley. Britannica Films (Producers). Management's five deadly diseases (videotape)... Fundamentals of statistical quality control seminar. Beaverton, OR: Author. Terninko, J. (1983). Statistical applications in automotive urethane...

6. VolumeExplorer: Roaming Large Volumes to Couple Visualization and Data Processing for Oil and Gas Exploration

OpenAIRE

2005-01-01

http://ieeexplore.ieee.org/; In this paper, we present a volume roaming system dedicated to oil and gas exploration. Our system combines probe-based volume rendering with data processing and computing. Daily oil production and the estimation of the world's proven reserves directly affect the barrel price and have a strong impact on the economy. Among other factors, production and correct estimation are linked to the accuracy of the subsurface model used for predicting oil reservoir shape and size...

7. Halo statistics analysis within medium volume cosmological N-body simulation

Directory of Open Access Journals (Sweden)

Martinović N.

2015-01-01

Full Text Available In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well-known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. A visualization of the simulation is presented as well. At redshifts z = 0-7, halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 - 10^14 M_sun/h.
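An exponent like the quoted n = 2.4 is typically recovered by linear least squares in log space, since log R = const + n·log(1 + z). A sketch with synthetic rates generated from n = 2.4 exactly (the redshifts, normalization, and rates are invented, not the simulation's data):

```python
# Sketch: recovering the exponent n of a merger-rate law R ∝ (1 + z)^n
# by ordinary least squares on log-transformed data.
import math

zs = [0.5, 1.0, 2.0, 3.0, 5.0, 7.0]          # illustrative redshifts
n_true = 2.4
rates = [0.1 * (1.0 + z) ** n_true for z in zs]   # synthetic, noise-free

xs = [math.log(1.0 + z) for z in zs]
ys = [math.log(r) for r in rates]
xbar = sum(xs) / len(xs)
ybar = sum(ys) / len(ys)
n_fit = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))      # OLS slope = exponent n
```

With real, noisy merger counts the same fit applies, with the scatter of the residuals giving an error estimate on n.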

8. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

Science.gov (United States)

2012-08-02

...-available basis beginning at 7:30 a.m. If you need special accommodations due to a disability, please... workshop is to educate participants on statistical process control theory and options for the...

9. Statistics to the Rescue!: Using Data to Evaluate a Manufacturing Process

Science.gov (United States)

Keithley, Michael G.

2009-01-01

The use of statistics and process controls is too often overlooked in educating students. This article describes an activity appropriate for high school students who have a background in material processing. It gives them a chance to advance their knowledge by determining whether or not a manufacturing process works well. The activity follows a…

10. Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control

DEFF Research Database (Denmark)

Vanhatalo, Erik; Kulahci, Murat

2015-01-01

A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled...
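The independence assumption can be probed directly: a frequently sampled dynamic process behaves like an autoregressive series, and its lag-1 autocorrelation is what inflates the false-alarm rate of control limits derived under independence. A minimal sketch with an invented AR(1) process:

```python
# Sketch: an AR(1) process (x_t = phi*x_{t-1} + e_t) as a stand-in for a
# frequently sampled industrial variable. Its lag-1 autocorrelation is
# near phi, violating the independence assumed by PCA-based SPC charts.
import random

rng = random.Random(11)
phi = 0.8
x, series = 0.0, []
for _ in range(5000):
    x = phi * x + rng.gauss(0.0, 1.0)
    series.append(x)

m = sum(series) / len(series)
num = sum((a - m) * (b - m) for a, b in zip(series, series[1:]))
den = sum((a - m) ** 2 for a in series)
r1 = num / den                 # sample lag-1 autocorrelation, near phi
```

A chart built assuming independence would underestimate the in-control variability of such a series, which is the practical problem the paper studies for principal components.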

11. Neutrophil volume, conductivity and scatter parameters with effective modeling of molecular activity statistical program gives better results in neonatal sepsis.

Science.gov (United States)

Celik, I H; Demirel, G; Sukhachev, D; Erdeve, O; Dilmen, U

2013-02-01

Neonatal sepsis remains an important clinical syndrome despite advances in neonatology. Current hematology analyzers can determine cell volume (V), conductivity reflecting the internal composition of the cell (C), and light scatter reflecting cytoplasmic granularity and nuclear structure (S), together with their standard deviations, which are effective in the diagnosis of sepsis. Statistical models can be used to strengthen the diagnosis. Effective modeling of molecular activity (EMMA) uses a combinatorial algorithm for selecting parameters for a regression equation based on a modified stepwise procedure. It allows different regression models with different combinations of parameters to be obtained. We investigated these parameters in screening for neonatal sepsis. We used an LH780 hematology analyzer (Beckman Coulter, Fullerton, CA, USA). We combined these parameters with interleukin-6 (IL-6) and C-reactive protein (CRP) and developed models by EMMA. A total of 304 newborns, 76 with proven sepsis, 130 with clinical sepsis and 98 controls, were enrolled in the study. Mean neutrophil volume (MNV) and volume distribution width (VDW) were higher in both the proven and clinical sepsis groups. We developed three models using MNV, VDW, IL-6, and CRP. These models gave higher sensitivity and specificity than each marker used alone. We suggest using the combination of MNV and VDW with markers such as CRP and IL-6, and using diagnostic models created by EMMA. © 2012 Blackwell Publishing Ltd.
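The claim that a combined model outperforms single markers can be illustrated with a rank-based AUC on synthetic data (all marker distributions and effect sizes below are invented; this is not the paper's EMMA procedure, only the generic combine-and-score idea):

```python
# Sketch: combining two diagnostic markers into one linear score and
# comparing discrimination via a rank-based AUC. Data are synthetic.
import random

rng = random.Random(5)

def auc(scores_pos, scores_neg):
    """P(random positive outranks random negative); ties count 0.5."""
    wins = sum((p > q) + 0.5 * (p == q)
               for p in scores_pos for q in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Cases: both markers shifted upward; controls: baseline.
cases = [(rng.gauss(1.0, 1.0), rng.gauss(1.0, 1.0)) for _ in range(200)]
controls = [(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) for _ in range(200)]

auc_single = auc([a for a, _ in cases], [a for a, _ in controls])
auc_combined = auc([a + b for a, b in cases], [a + b for a, b in controls])
# With two independent shifted markers, the combined score has a larger
# effect size, so its AUC is typically higher than either marker alone.
```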

12. Improving the process quality using statistical design of experiments: a case study.

Science.gov (United States)

Antony, J; Roy, R K

1998-01-01

A technique known as Statistical design of experiments is a powerful technique for process characterization, optimization, and modeling. It has been widely accepted in manufacturing industry for improving product performance and reliability, process capability, and yield. This article illustrates the application of statistical design of experiments based on the Taguchi approach in a certain company that manufactures electromagnetic clutch coils. The objective of the study was to improve the quality of the existing process and thereby achieve heightened customer satisfaction for the product. An eight-trial experiment was conducted with the aim of reducing the number of rejects from the process. The expected savings per month was estimated to be over $11,500. The results of the study have provided a greater stimulus for the wider application of statistical design of experiments in other core processes within the company. 13. On the statistical implications of certain Random permutations in Markovian Arrival Processes (MAPs) and second order self-similar processes DEFF Research Database (Denmark) Andersen, Allan T.; Nielsen, Bo Friis 2000-01-01 . The implications for the correlation structure when shuffling an exactly second-order self-similar process are examined. We apply the Markovian arrival process (MAP) as a tool to investigate whether general conclusions can be made with regard to the statistical implications of the shuffling experiments... 14. A local and global statistics pattern analysis method and its application to process fault identification☆ Institute of Scientific and Technical Information of China (English) Hanyuan Zhang; Xuemin Tian; Xiaogang Deng; Lianfang Cai 2015-01-01 Traditional principal component analysis (PCA) is a second-order method and lacks the ability to provide higher-order representations for data variables. 
Recently, a statistics pattern analysis (SPA) framework has been incor-porated into PCA model to make full use of various statistics of data variables effectively. However, these methods omit the local information, which is also important for process monitoring and fault diagnosis. In this paper, a local and global statistics pattern analysis (LGSPA) method, which integrates SPA framework and locality pre-serving projections within the PCA, is proposed to utilize various statistics and preserve both local and global in-formation in the observed data. For the purpose of fault detection, two monitoring indices are constructed based on the LGSPA model. In order to identify fault variables, an improved reconstruction based contribution (IRBC) plot based on LGSPA model is proposed to locate fault variables. The RBC of various statistics of original process variables to the monitoring indices is calculated with the proposed RBC method. Based on the calculated RBC of process variables' statistics, a new contribution of process variables is built to locate fault variables. The simula-tion results on a simple six-variable system and a continuous stirred tank reactor system demonstrate that the proposed fault diagnosis method can effectively detect fault and distinguish the fault variables from normal variables. 15. Statistical data generated through CFD to aid in the scale-up of shear sensitive processes Science.gov (United States) Khan, Irfan; Das, Shankhadeep; Cloeter, Mike; Gillis, Paul; Poindexter, Michael 2016-11-01 A number of industrial processes are considered shear-sensitive, where the product quality depends on achieving the right balance between mixing energy input and the resulting strain rate distribution in the process. Examples of such industrial processes are crystallization, flocculation and suspension polymerization. 
Scale-up of such processes are prone to a number of challenges including the optimization of mixing and shear rate distribution in the process. Computational Fluid Dynamics (CFD) can be a valuable tool to aid in the process scale-up; however for modeling purpose, the process will often need to be simplified appropriately to reduce the computational complexity. Commercial CFD tools with appropriate Lagrangian particle tracking models can be used to gather statistical data such as maximum strain rate distribution and maximum number of passes through a specific strain rate. This presentation will discuss such statistical tools and their application to a model scale-up problem. 16. Statistical relation between particle contaminations in ultra pure water and defects generated by process tools NARCIS (Netherlands) Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke 2007-01-01 Ultra pure water supplied inside the Fab is used in different tools at different stages of processing. Data of the particles measured in ultra pure water was compared with the defect density on wafers processed on these tools and a statistical relation is found Keywords— Yield, defect density, 17. Plasma-statistical models of the atom in the theory of some collisional and radiative processes NARCIS (Netherlands) Astapenko, VA 2002-01-01 A plasma-statistical model was used to describe collisional and radiative processes involving target ionization, namely, collisional ionization of atoms and incoherent polarization bremsstrahlung. The cross sections of these processes were expressed through the Compton profile of X-ray scattering, f 18. Higher order antibunching and subpossonian photon statistics in five wave mixing process CERN Document Server Verma, Amit 2009-01-01 We have investigated the possibility of observing higher order antibunching (HOA) and higher order subpossonian photon statistics (HOSPS) in five wave mixing and third harmonic generation process. 
It had been shown that both processes satisfy the criteria of HOA and HOSPS. Further, some observations on the nature of interaction which produces HOA and HOSPS are reported. 19. Improving the Process-Variation Tolerance of Digital Circuits Using Gate Sizing and Statistical Techniques CERN Document Server Neiroukh, Osama 2011-01-01 A new approach for enhancing the process-variation tolerance of digital circuits is described. We extend recent advances in statistical timing analysis into an optimization framework. Our objective is to reduce the performance variance of a technology-mapped circuit where delays across elements are represented by random variables which capture the manufacturing variations. We introduce the notion of statistical critical paths, which account for both means and variances of performance variation. An optimization engine is used to size gates with a goal of reducing the timing variance along the statistical critical paths. We apply a pair of nested statistical analysis methods deploying a slower more accurate approach for tracking statistical critical paths and a fast engine for evaluation of gate size assignments. We derive a new approximation for the max operation on random variables which is deployed for the faster inner engine. Circuit optimization is carried out using a gain-based algorithm that terminates w... 20. Processing speed in normal aging: effects of white matter hyperintensities and hippocampal volume loss. Science.gov (United States) Papp, Kathryn V; Kaplan, Richard F; Springate, Beth; Moscufo, Nicola; Wakefield, Dorothy B; Guttmann, Charles R G; Wolfson, Leslie 2014-01-01 Changes in cognitive functioning are said to be part of normal aging. Quantitative MRI has made it possible to measure structural brain changes during aging which may underlie these decrements which include slowed information processing and memory loss. 
Much has been written on white matter hyperintensities (WMH), which are associated with cognitive deficits on tasks requiring processing speed and executive functioning, and hippocampal volume loss, which is associated with memory decline. Here we examine volumetric MRI measures of WMH and hippocampal volume loss together in relation to neuropsychological tests considered to be measures of executive functioning and processing speed in 81 non-demented elderly individuals, aged 75-90. Correlational analysis showed that when controlling for age, both greater WMH volume and smaller hippocampal volume were correlated with slower performances on most tests with the exception of a battery of continuous performance tests in which only WMH was correlated with slower reaction time (RT). We then performed a series of hierarchical multiple regression analyses to examine the independent contributions of greater WMH volume and reduced hippocampal volume to executive functioning and processing speed. The results showed that for the four measures requiring executive functioning and speed of processing, WMH volume and hippocampal volume combined predicted between 21.4% and 37% of the explained variance. These results suggest that WM integrity and hippocampal volume influence cognitive decline independently on tasks involving processing speed and executive function independent of age. 1. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory Science.gov (United States) 2016-05-12 SECURITY CLASSIFICATION OF: Three areas were investigated. First, new memory models of discrete-time and finitely-valued information sources are...computational and storage complexities are proved. Second, a statistical method is developed to estimate the memory depth of discrete-time and continuously...Distribution Unlimited UU UU UU UU 12-05-2016 15-May-2014 14-Feb-2015 Final Report: Statistical Inference on Memory Structure of Processes and Its Applications 2. 
Riesz transforms in statistical signal processing and their applications to speckle metrology: a review DEFF Research Database (Denmark) Wang, Wei; Zhang, Shun; Ma, Ning 2015-01-01 In this paper, high-dimensional statistical signal processing is revisited with the aim of introducing the concept of vector signal representation derived from the Riesz transforms, which are the natural extension and generalization of the one-dimensional Hilbert transform. Under the new concepts...... of vector correlations proposed recently, the statistical properties of the vector signal representation for random signal are presented and some applications to speckle metrology developed recently are reviewed to demonstrate the unique capability of Riesz transforms.... 3. Statistical error in simulations of Poisson processes: Example of diffusion in solids Science.gov (United States) Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V. 2016-08-01 Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method but is valid in the context of simulation of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations. 4. Repeated head trauma is associated with smaller thalamic volumes and slower processing speed: the Professional Fighters' Brain Health Study. Science.gov (United States) Bernick, Charles; Banks, Sarah J; Shin, Wanyong; Obuchowski, Nancy; Butler, Sam; Noback, Michael; Phillips, Michael; Lowe, Mark; Jones, Stephen; Modic, Michael 2015-08-01 Cumulative head trauma may alter brain structure and function.
We explored the relationship between exposure variables, cognition and MRI brain structural measures in a cohort of professional combatants. 224 fighters (131 mixed martial arts fighters and 93 boxers) participating in the Professional Fighters Brain Health Study, a longitudinal cohort study of licensed professional combatants, were recruited, as were 22 controls. Each participant underwent computerised cognitive testing and volumetric brain MRI. Fighting history including years of fighting and fights per year was obtained from self-report and published records. Statistical analyses of the baseline evaluations were applied cross-sectionally to determine the relationship between fight exposure variables and volumes of the hippocampus, amygdala, thalamus, caudate, putamen. Moreover, the relationship between exposure and brain volumes with cognitive function was assessed. Increasing exposure to repetitive head trauma measured by number of professional fights, years of fighting, or a Fight Exposure Score (FES) was associated with lower brain volumes, particularly the thalamus and caudate. In addition, speed of processing decreased with decreased thalamic volumes and with increasing fight exposure. Higher scores on a FES used to reflect exposure to repetitive head trauma were associated with greater likelihood of having cognitive impairment. Greater exposure to repetitive head trauma is associated with lower brain volumes and lower processing speed in active professional fighters. 5.
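Record 3 above notes that simulations of diffusion form a Poisson process whose event counts limit the attainable accuracy. A minimal numerical illustration of the resulting 1/√N relative error follows; the rate and simulation time are invented for illustration, and this is the generic Poisson scaling, not the paper's specific error expression for ion conductivity.

```python
import numpy as np

rng = np.random.default_rng(1)
rate, t_sim, n_runs = 50.0, 1.0, 4000   # hypothetical hop rate, simulated time, repeats

# Each "simulation" counts diffusion events in time t_sim (a Poisson process).
counts = rng.poisson(rate * t_sim, n_runs)
rate_estimates = counts / t_sim

# For a Poisson count with expectation N, the relative statistical error is 1/sqrt(N).
predicted_rel_err = 1.0 / np.sqrt(rate * t_sim)
observed_rel_err = rate_estimates.std() / rate
```

With N = 50 expected events per run, both the predicted and the empirically observed relative scatter of the rate estimate come out near 14%.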
Determination Of The Wear Fault In Spur Gear System Using Statistical Process Control Method Directory of Open Access Journals (Sweden) Sinan Maraş 2014-01-01 Full Text Available Vibration analysis is one of the early warning methods widely used in obtaining information about faults occurring on machine elements and structures. In this method, gear fault detection can be performed by analyzing the vibration test results using signal processing, artificial intelligence and statistical analysis methods. The objective of this study is to detect the existence of wear by examining, on statistical process control charts, changes in the vibrations of spur gears due to wear faults. In this study, artificial wear was created on the surfaces of spur gears so that they could be examined in a gear test rig. Then, these gears were mounted and vibration data were recorded by operating the system under various load and cycle-count conditions. Fault detection was demonstrated by analyzing data from undamaged and worn gears on statistical process control charts in real-time experimental studies. 6. Comparing estimates of climate change impacts from process-based and statistical crop models Science.gov (United States) Lobell, David B.; Asseng, Senthold 2017-01-01 The potential impacts of climate change on crop productivity are of widespread interest to those concerned with addressing climate change and improving global food security. Two common approaches to assess these impacts are process-based simulation models, which attempt to represent key dynamic processes affecting crop yields, and statistical models, which estimate functional relationships between historical observations of weather and yields. Examples of both approaches are increasingly found in the scientific literature, although often published in different disciplinary journals.
Here we compare published sensitivities to changes in temperature, precipitation, carbon dioxide (CO2), and ozone from each approach for the subset of crops, locations, and climate scenarios for which both have been applied. Despite a common perception that statistical models are more pessimistic, we find no systematic differences between the predicted sensitivities to warming from process-based and statistical models up to +2 °C, with limited evidence at higher levels of warming. For precipitation, there are many reasons why estimates could be expected to differ, but few estimates exist to develop robust comparisons, and precipitation changes are rarely the dominant factor for predicting impacts given the prominent role of temperature, CO2, and ozone changes. A common difference between process-based and statistical studies is that the former tend to include the effects of CO2 increases that accompany warming, whereas statistical models typically do not. Major needs moving forward include incorporating CO2 effects into statistical studies, improving both approaches’ treatment of ozone, and increasing the use of both methods within the same study. At the same time, those who fund or use crop model projections should understand that in the short-term, both approaches when done well are likely to provide similar estimates of warming impacts, with statistical models generally 7. 6823 Volume 12 No. 6 October 2012 PROCESSING PINEAPPLE ... African Journals Online (AJOL) CRSP 2012-10-06 Oct 6, 2012 ... investigate the processing of pineapple pulp waste from a processing plant, into a ... The pasting characteristics or properties of wheat flour fortified with the ... loaded with vitamins and minerals and especially rich in vitamin C ... 8. 
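The control-chart idea used in the spur-gear wear study above (record 5) can be sketched with a Shewhart individuals chart: limits are set from healthy-gear baseline data, and wear is flagged when readings cross them. The vibration levels below are invented for illustration, not measurements from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical RMS vibration amplitudes: 30 baseline readings from healthy
# gears, then 10 readings after artificial wear raises the level.
healthy = rng.normal(1.0, 0.05, 30)
worn = rng.normal(1.5, 0.05, 10)
series = np.concatenate([healthy, worn])

# Individuals chart: sigma estimated from the mean moving range (d2 = 1.128).
center = healthy.mean()
mbar = np.abs(np.diff(healthy)).mean()
sigma = mbar / 1.128
ucl, lcl = center + 3 * sigma, center - 3 * sigma

out_of_control = np.where((series > ucl) | (series < lcl))[0]
```

All ten post-wear readings fall above the upper control limit, which is the chart's signal that an assignable cause (here, wear) has entered the process.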
Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System Directory of Open Access Journals (Sweden) Stephan Birle 2016-01-01 Full Text Available In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes by the use of traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. Because the cognitive uncertainty among different experts about the limits that define still-acceptable control performance may differ considerably, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that alters the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller.
The presented experimental results show that the genetic tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the 9. Multivariate Statistical Process Monitoring and Control: Recent Developments and Applications to Chemical Industry Institute of Scientific and Technical Information of China (English) 梁军; 钱积新 2003-01-01 Multivariate statistical process monitoring and control (MSPM&C) methods for chemical process monitoring with statistical projection techniques such as principal component analysis (PCA) and partial least squares (PLS) are surveyed in this paper. The four-step procedure of performing MSPM&C for chemical processes (modeling of the process, detecting abnormal events or faults, identifying the variable(s) responsible for the faults, and diagnosing the source cause of the abnormal behavior) is analyzed. Several main research directions of MSPM&C reported in the literature are discussed, such as multi-way principal component analysis (MPCA) for batch processes, statistical monitoring and control for nonlinear processes, dynamic PCA and dynamic PLS, and on-line quality control by inferential models. Industrial applications of MSPM&C to several typical chemical processes, such as chemical reactors, distillation columns, polymerization processes, and petroleum refinery units, are summarized. Finally, some concluding remarks and future considerations are made. 10. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference? Science.gov (United States) Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud 2015-12-01 Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords.
Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities. 11. Principles for Checking the Statistical Results Processing Correctness of the Cavendish Classic Experiment Directory of Open Access Journals (Sweden) V. N. Tutubalin 2016-01-01 Full Text Available In teaching mathematical statistics, it is desirable that students of engineering and natural sciences be able to study methods of statistical processing based on data from real experiments. The conditions of these experiments are of critical importance in justifying the application of statistical methods. The article considers Henry Cavendish's classic experiment to determine the mean density of the Earth from this point of view. The article gives a detailed description of Cavendish's experimental setup, the ideas his experiments are based on, and the method used to determine the values for assessing the mean density of the Earth.
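The pendulum model with friction that the Cavendish article discusses can be sketched in standard form. Assuming linear (viscous) damping of a torsion pendulum with moment of inertia $I$, damping coefficient $c$, and torsion constant $\kappa$ (the symbols here are ours, not necessarily the article's):

```latex
I\,\ddot{\theta}(t) + c\,\dot{\theta}(t) + \kappa\,\theta(t) = 0,
\qquad
\theta(t) = \theta_0\, e^{-ct/2I} \cos(\omega t + \phi),
\qquad
\omega = \sqrt{\frac{\kappa}{I} - \Bigl(\frac{c}{2I}\Bigr)^{2}} .
```

Neglecting $c$, as Cavendish implicitly did, reduces this to the undamped period $T = 2\pi\sqrt{I/\kappa}$, from which $\kappa$, and ultimately the mean density of the Earth, is inferred.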
It also makes explicit the equation of a pendulum model with friction, on which Cavendish implicitly relied (while neglecting the friction). It is shown that the formal use of methods of mathematical statistics is not always justified. Detailed records of all experiments, published by Cavendish, enable us to study these data in terms of mathematical statistics, convince us of their statistical inhomogeneity, and demonstrate the impossibility of constructing a confidence interval to estimate accuracy. The article proposes an alternative way of processing Cavendish's data, implicitly using the pendulum model equation with friction, to reduce the effect of systematic errors and improve the agreement of Cavendish's results with modern data. 12. Cogeneration technology alternatives study. Volume 2: Industrial process characteristics Science.gov (United States) 1980-01-01 Information and data for 26 industrial processes are presented. The following information is given for each process: (1) a description of the process including the annual energy consumption and product production and plant capacity; (2) the energy requirements of the process for each unit of production and the detailed data concerning electrical energy requirements and also hot water, steam, and direct fired thermal requirements; (3) anticipated trends affecting energy requirements with new process or production technologies; and (4) representative plant data including capacity and projected requirements through the year 2000. 13. Statistics of Infima and Stopping Times of Entropy Production and Applications to Active Molecular Processes Science.gov (United States) Neri, Izaak; Roldán, Édgar; Jülicher, Frank 2017-01-01 We study the statistics of infima, stopping times, and passage probabilities of entropy production in nonequilibrium steady states, and we show that they are universal.
We consider two examples of stopping times: first-passage times of entropy production and waiting times of stochastic processes, which are the times when a system reaches a given state for the first time. Our main results are as follows: (i) The distribution of the global infimum of entropy production is exponential with mean equal to minus Boltzmann's constant; (ii) we find exact expressions for the passage probabilities of entropy production; (iii) we derive a fluctuation theorem for stopping-time distributions of entropy production. These results have interesting implications for stochastic processes that can be discussed in simple colloidal systems and in active molecular processes. In particular, we show that the timing and statistics of discrete chemical transitions of molecular processes, such as the steps of molecular motors, are governed by the statistics of entropy production. We also show that the extreme-value statistics of active molecular processes are governed by entropy production; for example, we derive a relation between the maximal excursion of a molecular motor against the direction of an external force and the infimum of the corresponding entropy-production fluctuations. Using this relation, we make predictions for the distribution of the maximum backtrack depth of RNA polymerases, which follow from our universal results for entropy-production infima. 14. 
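Result (i) of the record above can be stated compactly. As the abstract summarizes it, the global infimum of the entropy production $S_{\mathrm{tot}}$ in a nonequilibrium steady state is exponentially distributed with mean $-k_B$ (the notation below is assumed, not quoted from the paper):

```latex
\Pr\!\Bigl(-\inf_{t\ge 0} S_{\mathrm{tot}}(t) \ge s\Bigr) = e^{-s/k_{B}}, \qquad s \ge 0,
\quad\Longrightarrow\quad
\Bigl\langle \inf_{t\ge 0} S_{\mathrm{tot}}(t) \Bigr\rangle = -k_{B}.
```

The universality claim is that this law holds independently of the details of the underlying steady-state dynamics.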
Statistical Monitoring of Chemical Processes Based on Sensitive Kernel Principal Components Institute of Scientific and Technical Information of China (English) JIANG Qingchao; YAN Xuefeng 2013-01-01 The kernel principal component analysis (KPCA) method employs the first several kernel principal components (KPCs), which capture the most variance information of normal observations for process monitoring but may not reflect the fault information. In this study, sensitive kernel principal component analysis (SKPCA) is proposed to improve process monitoring performance, i.e., to deal with the discordance between the T2 statistic and the squared prediction error (SPE) statistic and to reduce missed detection rates. The T2 statistic can be used to measure the variation directly along each KPC and to analyze the detection performance as well as capture the most useful information in a process. By calculating the change rate of the T2 statistic along each KPC, SKPCA selects the sensitive kernel principal components for process monitoring. A simulated simple system and the Tennessee Eastman process are employed to demonstrate the efficiency of SKPCA in online monitoring. The results indicate that the monitoring performance is improved significantly. 15. STUDY ON COPOLYMER EPOXY RESIN MATRIX WITHOUT SHRINKAGE PART I VOLUME CHANGE DURING CURE PROCESSES Institute of Scientific and Technical Information of China (English) HE Pingsheng; ZHOU Zhiqiang; WANG Gengchao; PAN Caiyuan; WU Renjie 1988-01-01 The volume change of copolymer epoxy resins can be controlled by copolymerizing epoxy resin E51 with 3,9-di(5-norbornene-2,2)-1,5,7,11-tetraoxaspiro[5,5]undecane (NSOC). During curing, the volume changes of copolymer epoxy resins with various amounts of NSOC were measured with a dilatometer. The cure process produces no volume change when the equivalent ratio of epoxy resin E51 to NSOC is 5.88:1. 16. Theoretical test of Jarzynski's equality for reversible volume-switching processes of an ideal gas system.
Science.gov (United States) Sung, Jaeyoung 2007-07-01 We present an exact theoretical test of Jarzynski's equality (JE) for reversible volume-switching processes of an ideal gas system. The exact analysis shows that the prediction of JE for the free energy difference coincides with the work done on the gas system during the reversible process, which depends on the shape of the path of the reversible volume-switching process. 17. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY Directory of Open Access Journals (Sweden) Zuzana ANDRÁSSYOVÁ 2012-07-01 Full Text Available The study deals with an analysis of data aimed at improving the quality of statistical tools in the assembly processes of automobile seats. Normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are ever more approaches to handling non-normal data. The probability distribution of the measured data is first tested for goodness of fit between the empirical distribution and the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each characteristic of quality (Safety Regulation, S/R) individually. The study closely examines the measured data from an airbag assembly, aiming to obtain normally distributed data and to apply statistical process control. The results lead to rejection of the null hypothesis (the measured variables do not follow the normal distribution); it is therefore necessary to work on data transformation, supported by Minitab 15.
Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to a quality outcome for the entire statistical control of manufacturing processes. 18. Large-Deviation Results for Discriminant Statistics of Gaussian Locally Stationary Processes Directory of Open Access Journals (Sweden) Junichi Hirukawa 2012-01-01 Full Text Available This paper discusses the large-deviation principle of discriminant statistics for Gaussian locally stationary processes. First, large-deviation theorems for quadratic forms and the log-likelihood ratio for a Gaussian locally stationary process with a mean function are proved. Their asymptotics are described by the large deviation rate functions. Second, we consider situations where the processes are misspecified to be stationary. In these misspecified cases, we formally construct the log-likelihood ratio discriminant statistics and derive large deviation theorems for them. Since they are complicated, they are evaluated and illustrated by numerical examples. We find that misspecifying the process as stationary seriously affects our discrimination. 19. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day Science.gov (United States) Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann 2013-01-01 Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech… 20.
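The workflow of the automotive SPC study above (test the data for normality, then transform when normality is rejected) can be sketched with SciPy. The data here are synthetic and right-skewed by construction; the study itself used StatGraphics and Minitab rather than Python, so this is only an illustrative analogue.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Hypothetical right-skewed quality measurements (clearly non-normal).
raw = rng.lognormal(mean=0.0, sigma=0.5, size=200)

# Step 1: goodness-of-fit test against the normal distribution.
stat_raw, p_raw = stats.shapiro(raw)

# Step 2: normality rejected, so apply a Box-Cox transformation and retest.
transformed, lam = stats.boxcox(raw)
stat_trans, p_trans = stats.shapiro(transformed)
```

For lognormal data the maximum-likelihood Box-Cox exponent comes out near zero (a log transform), and the transformed data pass the normality check that the raw data fail, which is the precondition for applying standard control charts.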
1. Learning Curves and Bootstrap Estimates for Inference with Gaussian Processes: A Statistical Mechanics Study DEFF Research Database (Denmark) Malzahn, Dorthe; Opper, Manfred 2003-01-01 We employ the replica method of statistical physics to study the average case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on ...... on Gaussian processes, we discuss Bootstrap estimates for learning curves.... 3. Must a process be in statistical control before conducting designed experiments? NARCIS (Netherlands) Bisgaard, S. 2008-01-01 Fisher demonstrated three quarters of a century ago that the three key concepts of randomization, blocking, and replication make it possible to conduct experiments on processes that are not necessarily in a state of statistical control.
However, even today there persists confusion about whether stat 4. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book. Science.gov (United States) Averitt, Sallie D. This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator… 5. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course Science.gov (United States) Smith, Toni M.; Hjalmarson, Margret A. 2013-01-01 The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses… 6. Design of U-Geometry Parameters Using Statistical Analysis Techniques in the U-Bending Process OpenAIRE Wiriyakorn Phanitwong; Untika Boochakul; Sutasn Thipprakmas 2017-01-01 The various U-geometry parameters in the U-bending process result in processing difficulties in the control of the spring-back characteristic. In this study, the effects of U-geometry parameters, including channel width, bend angle, material thickness, tool radius, as well as workpiece length, and their design, were investigated using a combination of finite element method (FEM) simulation, and statistical analysis techniques. Based on stress distribution analyses, the FEM simulation results ... 7. Gasoline from coal in the state of Illinois: feasibility study. Volume I. Design. 
[KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process Energy Technology Data Exchange (ETDEWEB) 1980-01-01 Volume 1 describes the proposed plant: KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process, and also with ancillary processes, such as oxygen plant, shift process, RECTISOL purification process, sulfur recovery equipment and pollution control equipment. Numerous engineering diagrams are included. (LTN) 9. Efficient visual system processing of spatial and luminance statistics in representational and non-representational art Science.gov (United States) Graham, Daniel J.; Friedenberg, Jay D.; Rockmore, Daniel N. 2009-02-01 An emerging body of research suggests that artists consistently seek modes of representation that are efficiently processed by the human visual system, and that these shared properties could leave statistical signatures. In earlier work, we showed evidence that perceived similarity of representational art could be predicted using intensity statistics to which the early visual system is attuned, though semantic content was also found to be an important factor. Here we report two studies that examine the visual perception of similarity.
We test a collection of non-representational art, which we argue possesses useful statistical and semantic properties, in terms of the relationship between image statistics and basic perceptual responses. We find two simple statistics, both expressed as single values, that predict nearly a third of the overall variance in similarity judgments of abstract art. An efficient visual system could make a quick and reasonable guess as to the relationship of a given image to others (i.e., its context) by extracting these basic statistics early in the visual stream, and this may hold for natural scenes as well as art. But a major component of many types of art is representational content. In a second study, we present findings related to efficient representation of natural scene luminances in landscapes by a well-known painter. We show empirically that elements of contemporary approaches to high-dynamic-range tone mapping, which are themselves deeply rooted in an understanding of early visual system coding, are present in the way Vincent Van Gogh transforms scene luminances into painting luminances. We argue that global tone mapping functions are a useful descriptor of an artist's perceptual goals with respect to global illumination, and we present evidence that mapping the scene to a painting with different implied lighting properties produces a less efficient mapping. Together, these studies suggest that statistical regularities in art can shed 10. Statistically Qualified Neuro-Analytic System and Method for Process Monitoring Energy Technology Data Exchange (ETDEWEB) Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W. 1998-11-04 An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation.
Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. 11. A survey of image processing techniques and statistics for ballistic specimens in forensic science. Science.gov (United States) Gerules, George; Bhatia, Sanjiv K; Jackson, Daniel E 2013-06-01 This paper provides a review of recent investigations on the image processing techniques used to match spent bullets and cartridge cases. It is also, to a lesser extent, a review of the statistical methods that are used to judge the uniqueness of fired bullets and spent cartridge cases. We review 2D and 3D imaging techniques as well as many of the algorithms used to match these images. We also provide a discussion of the strengths and weaknesses of these methods for both image matching and statistical uniqueness. The goal of this paper is to be a reference for investigators and scientists working in this field. 12.
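The SQNA record above validates its monitoring model with "known sequential probability ratio tests". Wald's classic SPRT for two simple Gaussian hypotheses can be sketched as follows; the means, variance, and error rates below are illustrative choices, not values from the patent.

```python
import math
import numpy as np

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=1e-4, beta=1e-4):
    """Wald's SPRT between H0: N(mu0, sigma^2) and H1: N(mu1, sigma^2).
    Returns the decision and the number of samples consumed."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for i, x in enumerate(samples, 1):
        # Log-likelihood-ratio increment for one Gaussian observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "H1", i
        if llr <= lower:
            return "H0", i
    return "undecided", len(samples)

rng = np.random.default_rng(9)
decision_fault, n_fault = sprt(rng.normal(1.0, 1.0, 200))  # data generated under H1
decision_ok, n_ok = sprt(rng.normal(0.0, 1.0, 200))        # data generated under H0
```

The appeal for on-line monitoring is that the test stops as soon as the accumulated evidence crosses a boundary, typically after far fewer samples than a fixed-size test with the same error rates.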
A new ordering principle for the classical statistical analysis of Poisson processes with background CERN Document Server Giunti, C 1999 Inspired by the recent proposal by Feldman and Cousins of a unified approach to the classical statistical analysis of "small signals" based on a choice of ordering in Neyman's construction of classical confidence intervals, I propose a new ordering principle for the classical statistical analysis of Poisson processes with background which minimizes the effect on the resulting confidence intervals of the observation of less background events than expected. The new ordering principle is applied to the calculation of the confidence region implied by the recent null result of the KARMEN neutrino oscillation experiment. 13. New ordering principle for the classical statistical analysis of Poisson processes with background Science.gov (United States) Giunti, C. 1999-03-01 Inspired by the recent proposal by Feldman and Cousins of a unified approach to the classical statistical analysis of "small signals" based on a choice of ordering in Neyman's construction of classical confidence intervals, I propose a new ordering principle for the classical statistical analysis of Poisson processes with a background which minimizes the effect on the resulting confidence intervals of the observation of fewer background events than expected. The new ordering principle is applied to the calculation of the confidence region implied by the recent null result of the KARMEN neutrino oscillation experiment. 14. A numerical-statistical approach to determining the representative elementary volume (REV) of cement paste for measuring diffusivity Directory of Open Access Journals (Sweden) Zhang, M. Z. 2010-12-01 Full Text Available Concrete diffusivity is a function of its microstructure on many scales, ranging from nanometres to millimetres. Multi-scale techniques are therefore needed to model this parameter.
Representative elementary volume (REV), in conjunction with the homogenization principle, is one of the most common multi-scale approaches. This study aimed to establish a procedure for determining the REV required to measure cement paste diffusivity, based on a three-step, numerical-statistical approach. First, several series of 3D cement paste microstructures were generated with HYMOSTRUC3D, a cement hydration and microstructure model, for different volumes of cement paste and w/c ratios ranging from 0.30 to 0.60. Second, the finite element method was used to simulate the diffusion of tritiated water through these microstructures. Effective cement paste diffusivity values for different REVs were obtained by applying Fick’s law. Finally, statistical analysis was used to find the fluctuation in effective diffusivity with cement paste volume, from which the REV was then determined. The conclusion drawn was that the REV for measuring diffusivity in cement paste is 100×100×100 μm³. 15. The use of process models to inform and improve statistical models of nitrate occurrence, Great Miami River Basin, southwestern Ohio Science.gov (United States) Walter, Donald A.; Starn, J.
Jeffrey 2013-01-01 Statistical models of nitrate occurrence in the glacial aquifer system of the northern United States, developed by the U.S. Geological Survey, use observed relations between nitrate concentrations and sets of explanatory variables—representing well-construction, environmental, and source characteristics—to predict the probability that nitrate, as nitrogen, will exceed a threshold concentration. However, the models do not explicitly account for the processes that control the transport of nitrogen from surface sources to a pumped well and use area-weighted mean spatial variables computed from within a circular buffer around the well as a simplified source-area conceptualization. The use of models that explicitly represent physical-transport processes can inform and, potentially, improve these statistical models. Specifically, groundwater-flow models simulate advective transport—predominant in many surficial aquifers—and can contribute to the refinement of the statistical models by (1) providing for improved, physically based representations of a source area to a well, and (2) allowing for more detailed estimates of environmental variables. A source area to a well, known as a contributing recharge area, represents the area at the water table that contributes recharge to a pumped well; a well pumped at a volumetric rate equal to the amount of recharge through a circular buffer will result in a contributing recharge area that is the same size as the buffer but has a shape that is a function of the hydrologic setting. These volume-equivalent contributing recharge areas will approximate circular buffers in areas of relatively flat hydraulic gradients, such as near groundwater divides, but in areas with steep hydraulic gradients will be elongated in the upgradient direction and agree less with the corresponding circular buffers.
The degree to which process-model-estimated contributing recharge areas, which simulate advective transport and therefore account for 16. Effective application of statistical process control (SPC) on the lengthwise tonsure rolled plates process Directory of Open Access Journals (Sweden) D. Noskievičová 2012-01-01 Full Text Available This paper deals with the effective application of SPC on the lengthwise tonsure rolled plates process on double side scissors. After explanation of the SPC fundamentals, goals and mistakes during the SPC implementation, the methodical framework for the effective SPC application is defined. In the next part of the paper, the description of the practical application of SPC and its analysis from the point of view of this framework is accomplished. 17. Effect of hospital volume on processes of breast cancer care: A National Cancer Data Base study. Science.gov (United States) Yen, Tina W F; Pezzin, Liliana E; Li, Jianing; Sparapani, Rodney; Laud, Purushuttom W; Nattinger, Ann B 2017-05-15 The purpose of this study was to examine variations in the delivery of several breast cancer processes of care that are correlated with lower mortality and disease recurrence, and to determine the extent to which hospital volume explains this variation. Women who were diagnosed with stage I-III unilateral breast cancer between 2007 and 2011 were identified within the National Cancer Data Base. Multiple logistic regression models were developed to determine whether hospital volume was independently associated with each of 10 individual process of care measures addressing diagnosis and treatment, and 2 composite measures assessing appropriateness of systemic treatment (chemotherapy and hormonal therapy) and locoregional treatment (margin status and radiation therapy). Among 573,571 women treated at 1755 different hospitals, 38%, 51%, and 10% were treated at high-, medium-, and low-volume hospitals, respectively.
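For readers unfamiliar with the odds-ratio scale used in the hospital-volume study, the unadjusted analogue can be computed from a 2×2 table. The sketch below is a generic Wald-interval calculation with hypothetical counts, not data from the study, whose reported odds ratios come from multiple logistic regression with covariate adjustment.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a, b = outcome yes/no in the exposed group (e.g. high-volume hospitals),
    c, d = outcome yes/no in the reference group (e.g. low-volume hospitals)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

With hypothetical counts, `odds_ratio_ci(800, 200, 700, 300)` gives a point estimate of about 1.71 with a confidence interval of roughly (1.40, 2.11); an interval excluding 1 indicates a volume-process association of the kind reported in the study.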
On multivariate analysis controlling for patient sociodemographic characteristics, treatment year and geographic location, hospital volume was a significant predictor for cancer diagnosis by initial biopsy (medium volume: odds ratio [OR] = 1.15, 95% confidence interval [CI] = 1.05-1.25; high volume: OR = 1.30, 95% CI = 1.14-1.49), negative surgical margins (medium volume: OR = 1.15, 95% CI = 1.06-1.24; high volume: OR = 1.28, 95% CI = 1.13-1.44), and appropriate locoregional treatment (medium volume: OR = 1.12, 95% CI = 1.07-1.17; high volume: OR = 1.16, 95% CI = 1.09-1.24). Diagnosis of breast cancer before initial surgery, negative surgical margins and appropriate use of radiation therapy may partially explain the volume-survival relationship. Dissemination of these processes of care to a broader group of hospitals could potentially improve the overall quality of care and outcomes of breast cancer survivors. Cancer 2017;123:957-66. © 2016 American Cancer Society. 18. Language Diversity and Cognitive Representations. Human Cognitive Processing, Volume 3. Science.gov (United States) Fuchs, Catherine, Ed.; Robert, Stephane, Ed. This book brings together the contributions of individual language scholars, linguists, anthropologists, psychologists, and neurophysicians. Each chapter focuses on the human cognitive processes involved in language activity and the impact of language diversity on them. The basic issue is how to correlate language diversity with the universality… 19. Cogeneration Technology Alternatives Study (CTAS). Volume 3: Industrial processes Science.gov (United States) Palmer, W. B.; Gerlaugh, H. E.; Priestley, R. R. 1980-01-01 Cogenerating electric power and process heat in single energy conversion systems rather than separately in utility plants and in process boilers is examined in terms of cost savings.
The use of various advanced energy conversion systems is examined and compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. About fifty industrial processes from the target energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems considered candidates that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. Fuels considered were coal, both coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through the on-site gasification of coal. An attempt was made to use consistent assumptions and a consistent set of ground rules specified by NASA for determining performance and cost. Data and narrative descriptions of the industrial processes are given. 20. Statistical Study to Evaluate the Effect of Processing Variables on Shrinkage Incidence During Solidification of Nodular Cast Irons Science.gov (United States) Gutiérrez, J. M.; Natxiondo, A.; Nieves, J.; Zabala, A.; Sertucha, J. 2017-04-01 The study of shrinkage incidence variations in nodular cast irons is an important aspect of manufacturing processes. These variations change the feeding requirements on castings, and the optimization of risers' size is consequently affected when avoiding the formation of shrinkage defects. The effect of a number of processing variables on the shrinkage size has been studied using a layout specifically designed for this purpose. The β parameter has been defined as the relative volume reduction from the pouring temperature down to room temperature.
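Taking β as defined in the shrinkage study above (the relative volume reduction between pouring and room temperature), the computation is a one-liner; the function name and the sample volumes in the usage note are illustrative, not values from the study.

```python
def beta_shrinkage(v_pour, v_room):
    """beta = (V_pour - V_room) / V_pour, the relative volume reduction
    between the pouring temperature and room temperature."""
    if v_pour <= 0:
        raise ValueError("pouring-temperature volume must be positive")
    return (v_pour - v_room) / v_pour
```

For instance, a casting occupying 1000 cm³ at pouring temperature and 952 cm³ at room temperature gives β = 0.048; lower β means smaller feeding requirements and smaller risers.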
It is observed that shrinkage size and β decrease as effective carbon content increases and when inoculant is added in the pouring stream. A similar effect is found when the parameters selected from cooling curves show high graphite nucleation during solidification of cast irons for a given inoculation level. Pearson statistical analysis has been used to analyze the correlations among all involved variables, and a group of Bayesian networks have been subsequently built so as to get the most accurate model for predicting β as a function of the input processing variables. The developed models can be used in foundry plants to study shrinkage incidence variations in the manufacturing process and to optimize the related costs. 1. DEVELOPMENT OF A METHOD FOR STATISTICAL ANALYSIS OF THE ACCURACY AND STABILITY OF THE PROCESS OF PRODUCTION OF EPOXY RESIN ED-20 Directory of Open Access Journals (Sweden) N. V. Zhelninskaya 2015-01-01 Full Text Available Statistical methods play an important role in the objective evaluation of quantitative and qualitative characteristics of a process and are one of the most important elements of the production quality assurance system and total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the accuracy specified for the product, and assess process stability. Most random events in life, and particularly in manufacturing and scientific research, are characterized by the presence of a large number of random factors and are described by a normal distribution, which is the principal distribution in many practical studies. Modern statistical methods are quite difficult to grasp and put to wide practical use without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, one can obtain all the characteristics of a batch of products and determine the mean value and the variance.
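The epoxy-resin entry's point, that a fitted normal distribution lets one recover batch characteristics, can be sketched as follows; the sample values and specification limits are hypothetical, and the standard normal CDF is built from math.erf.

```python
import math
import statistics

def fraction_out_of_spec(samples, lsl, usl):
    """Fit a normal distribution (mean, standard deviation) to the samples
    and estimate the fraction of product outside [lsl, usl]."""
    mu = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    defect = phi((lsl - mu) / sd) + (1.0 - phi((usl - mu) / sd))
    return mu, sd, defect
```

For five hypothetical readings clustered around 10.0 with specification limits 9.0-11.0, the estimated defect fraction is effectively zero; widening the spread or shifting the mean drives it up, which is exactly what the accuracy and stability analysis is meant to detect.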
Statistical control and quality control methods were used to analyze the accuracy and stability of the technological process of production of epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects in the products of the investigated object was determined. To assess the stability of the ED-20 manufacturing process, Shewhart control charts for quantitative data were selected: charts of individual values X and moving ranges R. Pareto charts were used to identify the causes that affect low dynamic viscosity to the largest extent. The causes of the low dynamic viscosity values were analyzed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of 2. Los Alamos Controlled Air Incinerator for radioactive waste. Volume I. Rationale, process, equipment, performance, and recommendations Energy Technology Data Exchange (ETDEWEB) Neuls, A.S.; Draper, W.E.; Koenig, R.A.; Newmyer, J.M.; Warner, C.L. 1982-08-01 This two-volume report provides detailed design and operating documentation of the Los Alamos National Laboratory Controlled Air Incinerator (CAI) and is an aid to technology transfer to other Department of Energy contractor sites and the commercial sector. Volume I describes the CAI process, equipment, and performance, and it recommends modifications based on Los Alamos experience. It provides the necessary information for conceptual design and feasibility studies. Volume II provides descriptive engineering information such as drawings, specifications, calculations, and costs. It aids duplication of the process at other facilities. 3.
Measurement and modeling of advanced coal conversion processes, Volume III Energy Technology Data Exchange (ETDEWEB) Ghani, M.U.; Hobbs, M.L.; Hamblen, D.G. [and others] 1993-08-01 A generalized one-dimensional, heterogeneous, steady-state, fixed-bed model for coal gasification and combustion is presented. The model, FBED-1, is a design and analysis tool that can be used to simulate a variety of gasification, devolatilization, and combustion processes. The model considers separate gas and solid temperatures, axially variable solid and gas flow rates, variable bed void fraction, coal drying, devolatilization based on chemical functional group composition, depolymerization, vaporization and crosslinking, oxidation and gasification of char, and partial equilibrium in the gas phase. 4. Estimation of Apple Volume and Its Shape Indentation Using Image Processing Technique and Neural Network Directory of Open Access Journals (Sweden) M Jafarlou 2014-04-01 Full Text Available Physical properties of agricultural products, such as volume, are the most important parameters influencing grading and packaging systems. They should be measured accurately, as they are essential inputs to any good system design. Image processing and neural network techniques are non-destructive methods that have recently been used for this purpose. In this study, images of apples were captured from a constant distance and processed in MATLAB software, and the edges of the apple images were extracted. The interior area of the apple image was divided into thin trapezoidal elements perpendicular to the longitudinal axis. The total volume of the apple was estimated by summing the incremental volumes of these elements revolved around the apple’s longitudinal axis. A picture of the half-cut apple was also captured in order to obtain the volume of the indentation in the apple's shape, which was subtracted from the previously estimated total volume of the apple.
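The element-summation described in the apple-volume entry is a discrete solid of revolution: each pair of adjacent edge radii bounds a truncated cone. A minimal sketch, validated against a sphere rather than a real apple edge profile, might look like this.

```python
import math

def volume_of_revolution(radii, dx):
    """Volume of a solid of revolution from a sampled radius profile:
    each adjacent pair of radii bounds a truncated cone of thickness dx,
    V_element = pi * dx * (r1^2 + r1*r2 + r2^2) / 3."""
    total = 0.0
    for r1, r2 in zip(radii, radii[1:]):
        total += math.pi * dx * (r1 * r1 + r1 * r2 + r2 * r2) / 3.0
    return total
```

Sampling the profile of a unit sphere at 2000 stations reproduces 4π/3 to well under 1%; in the entry above, the radii would instead come from the extracted apple edge, and the indentation volume obtained from the half-cut image would be subtracted from the result.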
The real volume of the apples was measured using the water displacement method, and the relation between the real and estimated volumes was obtained. The t-test and Bland-Altman analysis indicated that the difference between the real volume and the estimated volume was not significant (p>0.05); the mean difference was 1.52 cm³ and the accuracy of measurement was 92%. Utilizing a neural network with dimensions and mass as input variables increased the accuracy to 97%, and the difference between the mean volumes decreased to 0.7 cm³. 5. Federal Funds for Research and Development: Fiscal Years 1980, 1981, and 1982. Volume XXX. Detailed Statistical Tables. Surveys of Science Resources Series. Science.gov (United States) National Science Foundation, Washington, DC. During the March through July 1981 period, a total of 36 Federal agencies and their subdivisions (95 individual respondents) submitted data in response to the Annual Survey of Federal Funds for Research and Development, Volume XXX, conducted by the National Science Foundation. The detailed statistical tables presented in this report were derived… 6. Optimizing Friction Stir Welding via Statistical Design of Tool Geometry and Process Parameters Science.gov (United States) Blignault, C.; Hattingh, D. G.; James, M. N. 2012-06-01 This article considers optimization procedures for friction stir welding (FSW) in 5083-H321 aluminum alloy, via control of weld process parameters and tool design modifications. It demonstrates the potential utility of the "force footprint" (FF) diagram in providing a real-time graphical user interface (GUI) for process optimization of FSW. Multiple force, torque, and temperature responses were recorded during FS welding using 24 different tool pin geometries, and these data were statistically analyzed to determine the relative influence of a number of combinations of important process and tool geometry parameters on tensile strength.
Desirability profile charts are presented, which show the influence of seven key combinations of weld process variables on tensile strength. The model developed in this study allows the weld tensile strength to be predicted for other combinations of tool geometry and process parameters within an average error of 13%. General guidelines for tool profile selection and the likelihood of influencing weld tensile strength are also provided. 7. Statistical properties of a discrete version of the Ornstein-Uhlenbeck process. Science.gov (United States) Larralde, Hernán 2004-02-01 A discrete version of the Ornstein-Uhlenbeck process is discussed which arises from a simple generalization of the master equation of the random walk. The calculation of the statistical properties of the free propagator for this process can be obtained using essentially the same formalism as for simple random walks. These calculations are carried out in some detail for the one-dimensional case. The usual equation for the evolution of the probability distribution of the Ornstein-Uhlenbeck process is recovered in the continuum limit if the jump distribution has a finite variance. However, the discrete process is also well defined for long-tailed jump distributions and, thus, can be used to describe a Lévy walk under the effect of a harmonic potential. Finally, a brief discussion of the generalization of this process to describe random walks in general potentials is presented and briefly compared with results arising from the fractional diffusion approach. 8. ANALISIS KEHILANGAN MINYAK PADA CRUDE PALM OIL (CPO DENGAN MENGGUNAKAN METODE STATISTICAL PROCESS CONTROL Directory of Open Access Journals (Sweden) Vera Devani 2014-06-01 Full Text Available PKS “XYZ” is a company operating in the palm oil processing industry. Its products are Crude Palm Oil (CPO) and Palm Kernel Oil (PKO).
The aim of this study was to analyze oil losses and their causal factors using the Statistical Process Control method. Statistical Process Control is a set of strategies, techniques, and actions taken by an organization to ensure that it delivers quality products or provides quality services. The CPO oil-loss samples examined were empty fruit bunches, nuts, fibre, and final sludge. Based on the I-MR control charts, it can be concluded that all four types of CPO oil losses were within the control limits and consistent. The Cpk value of total oil losses, however, lay outside the control limits of the process mean; this means that the CPO produced met customer requirements, with total oil losses below the maximum limit set by the company, namely 1.65%. 9. Nonlinear Statistical Process Monitoring Based on Control Charts with Memory Effect and Kernel Independent Component Analysis Institute of Scientific and Technical Information of China (English) 2007-01-01 A novel nonlinear combination process monitoring method was proposed based on techniques with memory effect (multivariate exponentially weighted moving average (MEWMA)) and kernel independent component analysis (KICA). The method was developed for dealing with nonlinear issues and detecting small or moderate drifts in one or more process variables with autocorrelation. MEWMA charts use additional information from the past history of the process to retain the memory effect of the process behavior trend. KICA is a recently developed statistical technique for revealing hidden, nonlinear, statistically independent factors that underlie sets of measurements; it is a two-phase algorithm: whitened kernel principal component analysis (KPCA) plus independent component analysis (ICA).
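The MEWMA half of the MEWMA-KICA scheme can be sketched on its own. Below is the standard MEWMA recursion with the asymptotic control covariance, written in NumPy; the in-control mean is assumed to be zero, the covariance is estimated from the data, and the smoothing constant lam = 0.2 is an illustrative choice. This is not the authors' full pipeline, which applies the chart to kernel-ICA scores rather than raw variables.

```python
import numpy as np

def mewma_statistics(X, lam=0.2):
    """MEWMA T^2 sequence for observations X (n x p), in-control mean 0:
    z_t = lam * x_t + (1 - lam) * z_{t-1},
    T2_t = z_t' Sigma_z^{-1} z_t, with the asymptotic
    Sigma_z = lam / (2 - lam) * Sigma."""
    X = np.asarray(X, dtype=float)
    sigma = np.cov(X, rowvar=False)                       # estimated process covariance
    sigma_z_inv = np.linalg.inv(lam / (2.0 - lam) * sigma)
    z = np.zeros(X.shape[1])
    t2 = []
    for x in X:
        z = lam * x + (1.0 - lam) * z                     # EWMA with memory effect
        t2.append(float(z @ sigma_z_inv @ z))
    return t2
```

Because the EWMA carries memory, a sustained small shift pushes T² up and keeps it there, which is exactly the small-drift sensitivity the entry attributes to the memory effect.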
The application to the fluid catalytic cracking unit (FCCU) simulated process indicates that the proposed combined method based on MEWMA and KICA can effectively capture the nonlinear relationship and detect small drifts in process variables. It significantly outperforms monitoring methods based on ICA, MEWMA-ICA and KICA, especially for long-term performance deterioration. 10. SOLTECH 92 proceedings: Solar Process Heat Program. Volume 1 Energy Technology Data Exchange (ETDEWEB) 1992-03-01 This document is a limited Proceedings, documenting the presentations given at the symposia conducted by the US Department of Energy's (DOE) Solar Industrial Program and Solar Thermal Electric Program at SOLTECH92. The SOLTECH92 national solar energy conference was held in Albuquerque, New Mexico during the period February 17-20, 1992. The National Renewable Energy Laboratory manages the Solar Industrial Program; Sandia National Laboratories (Albuquerque) manages the Solar Thermal Electric Program. The symposia sessions were as follows: (1) Solar Industrial Program and Solar Thermal Electric Program Overviews, (2) Solar Process Heat Applications, (3) Solar Decontamination of Water and Soil; (4) Solar Building Technologies, (5) Solar Thermal Electric Systems, (6) PV Applications and Technologies. For each presentation given in these symposia, these Proceedings provide a one- to two-page abstract and copies of the viewgraphs and/or 35mm slides utilized by the speaker. Some speakers provided additional materials in the interest of completeness. The materials presented in this document were not subjected to a peer review process. 11. Control of the aeration volume in an activated sludge process for nitrogen removal. Science.gov (United States) Samuelsson, P; Carlsson, B 2002-01-01 Biological nitrogen removal in an activated sludge process is obtained by two biological processes: nitrification and denitrification.
Nitrifying bacteria need dissolved oxygen and a sufficiently large aeration volume for converting ammonium to nitrate in the wastewater. The objective of this paper is to develop an automatic control strategy for adjusting the aerated volume so that the effluent ammonium level can be kept close to a desired value despite major changes in the influent load. The strategy is based on applying exact linearization of the IAWQ Activated Sludge Model No. 1. Simulation results show that the suggested controller effectively attenuates process disturbances. 12. Optimization of Rolling Process for Bi(2223)/Ag Superconducting Tapes by a Statistical Method Institute of Scientific and Technical Information of China (English) 2001-01-01 Ag-sheathed (Bi,Pb)2Sr2Ca2Cu3Ox tapes were prepared by the powder-in-tube method. The influences of rolling parameters on the superconducting characteristics of Bi(2223)/Ag tapes were analyzed qualitatively with a statistical method. The results demonstrate that roll diameter and reduction per pass significantly influence the properties of superconducting tapes, while roll speed does less and working friction the least. An optimized rolling process was therefore achieved according to the above results. 13. Multivariate Statistical Process Control and Case-Based Reasoning for situation assessment of Sequencing Batch Reactors OpenAIRE Ruiz Ordóñez, Magda Liliana 2008-01-01 ABSTRACT This thesis focuses on the monitoring, fault detection and diagnosis of Wastewater Treatment Plants (WWTP), which are important fields of research for a wide range of engineering disciplines. The main objective is to evaluate and apply a novel artificial intelligence methodology based on situation assessment for monitoring and diagnosis of Sequencing Batch Reactor (SBR) operation. To this end, Multivariate Statistical Process Control (MSPC) in combination with Case-Based Reasoning (CBR)... 14.
Treatment of the background error in the statistical analysis of Poisson processes CERN Document Server Giunti, C 1999-01-01 The formalism that allows one to take into account the error sigma_b of the expected mean background b in the statistical analysis of a Poisson process with the frequentist method is presented. It is shown that the error sigma_b cannot be neglected if it is not much smaller than sqrt(b). The resulting confidence belt is larger than the one for sigma_b=0, leading to larger confidence intervals for the mean mu of signal events. 15. Genetic covariation between brain volumes and IQ, reading performance, and processing speed. Science.gov (United States) Betjemann, Rebecca S; Johnson, Erin Phinney; Barnard, Holly; Boada, Richard; Filley, Christopher M; Filipek, Pauline A; Willcutt, Erik G; DeFries, John C; Pennington, Bruce F 2010-03-01 Although there has been much interest in the relation between brain size and cognition, few studies have investigated this relation within a genetic framework and fewer still in non-adult samples. We analyzed the genetic and environmental covariance between structural MRI data from four brain regions (total brain volume, neocortex, white matter, and prefrontal cortex) and four cognitive measures (verbal IQ (VIQ), performance IQ (PIQ), reading ability, and processing speed) in a sample of 41 MZ twin pairs and 30 same-sex DZ twin pairs (mean age at cognitive test = 11.4 years; mean age at scan = 15.4 years). Multivariate Cholesky decompositions were performed with each brain volume measure entered first, followed by the four cognitive measures. Consistent with previous research, each brain and cognitive measure was found to be significantly heritable. The novel finding was the significant genetic but not environmental covariance between brain volumes and cognitive measures. Specifically, PIQ shared significant common genetic variance with all four measures of brain volume (r(g) = .58-.82).
In contrast, VIQ shared significant genetic influence with neocortex volume only (r(g) = .58). Processing speed shared significant genetic covariance with total brain volume (r(g) = .79), neocortex (r(g) = .64), and white matter (r(g) = .89), but not prefrontal cortex. The only brain measure to share genetic influence with reading was total brain volume (r(g) = .32), which also shared genetic influences with processing speed. 16. Robust Multiscale Modelling Of Two-Phase Steels On Heterogeneous Hardware Infrastructures By Using Statistically Similar Representative Volume Element Directory of Open Access Journals (Sweden) Rauch Ł. 2015-09-01 Full Text Available Coupled finite element multiscale simulations (FE2) require costly numerical procedures in both macro and micro scales. Attempts to improve numerical efficiency are focused mainly on two areas of development, i.e. parallelization/distribution of numerical procedures and simplification of the virtual material representation. One representative of both mentioned areas is the idea of the Statistically Similar Representative Volume Element (SSRVE). It aims at reducing the number of finite elements in the micro scale as well as at parallelizing the micro-scale calculations, which can be performed without barriers. The simplification of the computational domain is realized by transforming sophisticated images of the material microstructure into artificially created simple objects characterized by features similar to those of their original equivalents. In existing solutions for two-phase steels, the SSRVE is created on the basis of an analysis of the shape coefficients of the hard phase in the real microstructure and a search for a representative simple structure with similar shape coefficients. Optimization techniques were used to solve this task. In the present paper, local strains and stresses are added to the cost function in the optimization.
Various forms of the objective function composed of different elements were investigated and used in the optimization procedure for the creation of the final SSRVE. The results are compared with respect to the efficiency of the procedure and the uniqueness of the solution. The best objective function, composed of shape coefficients as well as strains and stresses, was proposed. Examples of SSRVEs determined for the investigated two-phase steel using that objective function are demonstrated in the paper. Each step of SSRVE creation is investigated from the computational efficiency point of view. The proposition of implementation of the whole computational procedure on modern High Performance Computing (HPC) 17. Super Efficient Refrigerator Program (SERP) evaluation. Volume 1: Process evaluation Energy Technology Data Exchange (ETDEWEB) Sandahl, L.J.; Ledbetter, M.R.; Chin, R.I.; Lewis, K.S.; Norling, J.M. 1996-01-01 The Pacific Northwest National Laboratory (PNNL) conducted this study for the US Department of Energy (DOE) as part of the Super Efficient Refrigerator Program (SERP) Evaluation. This report documents the SERP formation and implementation process, and identifies preliminary program administration and implementation issues. The findings are based primarily on interviews with those familiar with the program, such as utilities, appliance manufacturers, and SERP administrators. These interviews occurred primarily between March and April 1995, when SERP was in the early stages of program implementation. A forthcoming report will estimate the preliminary impacts of SERP within the industry and marketplace. Both studies were funded by DOE at the request of SERP Inc., which sought a third-party evaluation of its program. 18. Tungsten Ions in Plasmas: Statistical Theory of Radiative-Collisional Processes Directory of Open Access Journals (Sweden) Alexander V.
Demura 2015-05-01 Full Text Available A statistical model for calculating collisional-radiative processes in plasmas with a tungsten impurity was developed. The electron structure of tungsten multielectron ions is considered in terms of both the Thomas-Fermi model and the Brandt-Lundquist model of collective oscillations of atomic electron density. The excitation or ionization of atomic electrons by plasma electron impacts is represented as photo-processes under the action of a flux of equivalent photons introduced by E. Fermi. The total electron-impact single-ionization cross-sections of ions Wk+, with the respective rates, have been calculated and compared with the available experimental and modeling data (e.g., CADW). Plasma radiative losses on the tungsten impurity were also calculated over a wide range of electron temperatures, 1 eV–20 keV. The numerical code TFATOM was developed for calculations of radiative-collisional processes involving tungsten ions. The computational resources needed for the TFATOM code are orders of magnitude smaller than for other conventional numerical codes. The transition from the corona to the Boltzmann limit was investigated in detail. The results of the statistical approach have been tested by comparison with the vast experimental and conventional-code data for a set of ions Wk+. It is shown that the accuracy of the universal statistical model for the ionization cross-sections and radiation losses is within the data scattering of significantly more complex quantum numerical codes, which use different approximations for the calculation of the atomic structure and the electronic cross-sections. 19. A statistical property of multiagent learning based on Markov decision process. Science.gov (United States) Iwata, Kazunori; Ikeda, Kazushi; Sakai, Hideaki 2006-07-01 We exhibit an important property called the asymptotic equipartition property (AEP) on empirical sequences in an ergodic multiagent Markov decision process (MDP).
Using the AEP, which facilitates the analysis of multiagent learning, we give a statistical property of multiagent learning, such as reinforcement learning (RL), near the end of the learning process. We examine the effect of the conditions among the agents on the achievement of a cooperative policy in three different cases: blind, visible, and communicable. We also derive a bound on the speed with which the empirical sequence converges to the best sequence in probability, so that the multiagent learning yields the best cooperative result.

20. Monitoring Actuarial Present Values of Term Life Insurance By a Statistical Process Control Chart

Science.gov (United States)

Hafidz Omar, M.

2015-06-01

Tracking the performance of a life insurance or similar insurance policy using a standard statistical process control chart is complex because of many factors. In this work, we present the difficulties in doing so. However, with some modifications of the SPC charting framework, the difficulty becomes manageable for actuaries. We therefore propose monitoring a simpler but natural actuarial quantity that is typically found in recursion formulas of reserves, profit testing, and present values. We share some simulation results for the monitoring process. Additionally, some advantages of this approach are discussed.

1. Statistical modeling of copper losses in the silicate slag of the sulfide concentrate smelting process

OpenAIRE

2015-01-01

This article presents the results of the statistical modeling of copper losses in the silicate slag of the sulfide concentrate smelting process. The aim of this study was to define the correlation dependence of the degree of copper losses in the silicate slag on the following parameters of the technological processes: SiO2, FeO, Fe3O4, CaO and Al2O3 content in the slag, and copper content in the matte. Multiple linear regression analysis (MLRA), artificial neural networks (ANNs) and adaptive netw…

2.
Statistical and dynamical aspects in fission process: The rotational degrees of freedom

Indian Academy of Sciences (India)

Bency John

2015-08-01

In the final phases of the fission process, there are fast collective rotational degrees of freedom, which can exert a force on the slower tilting rotational degree. Experimental observations that lead to this realization and theoretical studies that account for the dynamics of the processes are discussed briefly. Supported by these studies, and by assuming a conditional equilibrium of the collective rotational modes at a pre-scission point, a new statistical model for fission fragment angular and spin distributions has been developed. This model gives a consistent description of the fragment angular and spin distributions for a wide variety of heavy- and light-ion-induced fission reactions.

3. Process simulation and statistical approaches for validating waste form qualification models

Energy Technology Data Exchange (ETDEWEB)

Kuhn, W.L.; Toland, M.R.; Pulsipher, B.A.

1989-05-01

This report describes recent progress toward one of the principal objectives of the Nuclear Waste Treatment Program (NWTP) at the Pacific Northwest Laboratory (PNL): to establish relationships between vitrification process control and glass product quality. During testing of a vitrification system, it is important to show that departures affecting the product quality can be sufficiently detected through process measurements to prevent an unacceptable canister from being produced. Meeting this goal is a practical definition of a successful sampling, data analysis, and process control strategy. A simulation model has been developed and preliminarily tested by applying it to approximate operation of the West Valley Demonstration Project (WVDP) vitrification system at West Valley, New York. Multivariate statistical techniques have been identified and described that can be applied to analyze large sets of process measurements.
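Several records in this section monitor multivariate data through a Hotelling-type T² statistic, which collapses many correlated measurements into a single monitored quantity. As a generic illustration, not drawn from any record above: the two-variable in-control mean, covariance, and control limit below are invented for the sketch.

```python
# Generic illustration of a Hotelling-type T^2 monitoring statistic for
# two correlated process variables. The in-control mean, covariance,
# and control limit below are invented for the sketch.

def t2_statistic(x, mean, cov):
    """T^2 = (x - mean)' inv(cov) (x - mean) for two variables."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    inv = [[cov[1][1] / det, -cov[0][1] / det],       # explicit 2x2 inverse
           [-cov[1][0] / det, cov[0][0] / det]]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

IN_CONTROL_MEAN = [10.0, 5.0]
IN_CONTROL_COV = [[4.0, 1.0], [1.0, 2.0]]
UCL = 10.6  # roughly the chi-square (2 dof) upper 0.5% point

for obs in ([10.5, 5.2], [16.0, 1.0]):
    t2 = t2_statistic(obs, IN_CONTROL_MEAN, IN_CONTROL_COV)
    print(obs, round(t2, 3), "out-of-control" if t2 > UCL else "in-control")
```

With known in-control parameters the T² of an observation follows a chi-square distribution with as many degrees of freedom as there are variables, which is where the limit comes from; in practice the mean and covariance are estimated from a reference data set.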
Information on components, tanks, and time is then combined to create a single statistic through which all of the information can be used at once to determine whether the process has shifted away from a normal condition.

4. Statistics-enhanced multistage process models for integrated design & manufacturing of poly (vinyl alcohol) treated buckypaper

Science.gov (United States)

Wang, Kan

Carbon nanotube (CNT) is considered a promising engineering material because of its exceptional mechanical, electrical, and thermal properties. Buckypaper (BP), a thin sheet of assembled CNTs, is an effective way to handle CNTs at the macro scale. Pristine BP is a fragile material held together by weak van der Waals attractions among CNTs. This dissertation introduces a modified filtration-based manufacturing process which uses poly (vinyl alcohol) (PVA) to treat BP. This treatment greatly improves the handleability of BP, reduces spoilage during transfer, and shortens the production time. The multistage manufacturing process of PVA-treated BP is discussed in this dissertation, and process models are developed to predict the nanostructure of final products from the process parameters. Based on the nanostructure, a finite-element-based physical model for prediction of Young's modulus is also developed. The accuracy of this physical model is further improved by statistical methods. The aim of this study is to investigate and improve the scalability of the manufacturing process of PVA-treated BP. To achieve this goal, various statistical tools are employed. The unique issues in nanomanufacturing also motivate the development of new statistical tools and the modification of existing tools.
Those issues include the uncertainties in nanostructure characterization due to the scale, the limited experimental data due to the high cost of raw materials, the large variation in the final product due to the random nature of the structure, and the high complexity of the physical models due to the small scale of the structural building blocks. This dissertation addresses those issues by combining engineering field knowledge and statistical methods. The resulting statistics-enhanced physical model provides an approach to design the manufacturing process of PVA-treated BP for a target property and to tailor the robustness of the final product by manipulating the process parameters. In addition…

5. New advances in statistical modeling and applications

CERN Document Server

Santos, Rui; Oliveira, Maria; Paulino, Carlos

2014-01-01

This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.

6. Statistical process monitoring based on orthogonal multi-manifold projections and a novel variable contribution analysis

Science.gov (United States)

Tong, Chudong; Shi, Xuhua; Lan, Ting

2016-11-01

Multivariate statistical methods have been widely applied to develop data-based process monitoring models. Recently, a multi-manifold projections (MMP) algorithm was proposed for modeling and monitoring chemical industrial processes. The MMP is an effective tool for preserving the global and local geometric structure of the original data space in the reduced feature subspace, but it does not provide orthogonal basis functions for data reconstruction. In recognition of this issue, an improved version of the MMP algorithm, named orthogonal MMP (OMMP), is formulated.
Based on the OMMP model, a further processing step and a different monitoring index are proposed to model and monitor the variation in the residual subspace. Additionally, a novel variable contribution analysis is presented for fault diagnosis by integrating the nearest in-control neighbor calculation and reconstruction-based contribution analysis. The validity and superiority of the proposed fault detection and diagnosis strategy are then demonstrated through case studies on the Tennessee Eastman benchmark process.

7. Pierre Gy's sampling theory and sampling practice: heterogeneity, sampling correctness, and statistical process control

CERN Document Server

Pitard, Francis F

1993-01-01

Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit…

8.
On Compound Poisson Processes Arising in Change-Point Type Statistical Models as Limiting Likelihood Ratios

CERN Document Server

Dachian, Serguei

2010-01-01

Different change-point type models encountered in statistical inference for stochastic processes give rise to different limiting likelihood ratio processes. In a previous paper of one of the authors it was established that one of these likelihood ratios, which is an exponential functional of a two-sided Poisson process driven by some parameter, can be approximated (for sufficiently small values of the parameter) by another one, which is an exponential functional of a two-sided Brownian motion. In this paper we consider yet another likelihood ratio, which is the exponent of a two-sided compound Poisson process driven by some parameter. We establish that, similarly to the Poisson type one, the compound Poisson type likelihood ratio can be approximated by the Brownian type one for sufficiently small values of the parameter. We also discuss the asymptotics for large values of the parameter and illustrate the results by numerical simulations.

9. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools

Science.gov (United States)

Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

2012-05-01

In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect.
The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function, a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, (15)O-H(2)O and (18)F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism, respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.

10. Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals

Directory of Open Access Journals (Sweden)

Syafiq Hazwan

2016-01-01

Cooling systems are a key element of the hot forming process of Ultra High Strength Steels (UHSS). Normally, cooling systems are made using a deep drilling technique. Although the deep twist drill offers higher productivity than other drilling techniques, its main problem is premature tool breakage, which affects the production quality. In this paper, an analysis of deep twist drill process parameters, such as cutting speed, feed rate and depth of cut, using statistical analysis to identify the tool condition is presented.
The comparisons between two different tool geometries are also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation and skewness. Results show that the kurtosis and skewness values are the most appropriate parameters to represent the deep twist drill tool condition behavior from the vibration and force data. The condition of the deep twist drill process was classified as good, blunt or fractured. It was also found that the tool geometry parameters affect the performance of the drill. We believe the results of this study are useful in determining a suitable analysis method for developing an online tool condition monitoring system to identify the tertiary tool life stage and to help avoid tool fracture during the drilling process.

11. Methods and algorithms for statistical processing of instantaneous meteorological parameters from ultrasonic measurements

Science.gov (United States)

Rohmistrov, D. S.; Bogushevich, A. Ya; Botygin, I. A.

2016-11-01

This paper describes a software system designed to support atmospheric studies with ultrasonic thermo-anemometer data processing. The system is capable of processing files containing sets of instantaneous values of temperature, three orthogonal wind velocity components, humidity, and pressure. The paper presents a technological scheme for selecting the necessary meteorological parameters depending on the observation time, the averaging interval, and the period between the instantaneous values. The data processing consists of three stages. At the initial stage, a query for the necessary meteorological parameters is executed. At the second stage, the system calculates the standard statistical characteristics of the meteorological fields, such as mean values, dispersion, standard deviation, asymmetry coefficients, kurtosis, correlation, etc.
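The standard statistical characteristics that recur in the two records above (mean, RMS, standard deviation, skewness, kurtosis) can be sketched in plain Python; the sample signal below is invented for illustration, not taken from either study.

```python
# Illustrative sketch: the usual descriptive statistics computed from a
# sampled signal (e.g. vibration or wind-velocity data). The input
# values are made up.
import math

def signal_stats(x):
    n = len(x)
    mean = sum(x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)
    m2 = sum((v - mean) ** 2 for v in x) / n   # population variance
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    std = math.sqrt(m2)
    skew = m3 / m2 ** 1.5 if m2 else 0.0       # 0 for a symmetric signal
    kurt = m4 / m2 ** 2 if m2 else 0.0         # 3 for a Gaussian signal
    return {"mean": mean, "rms": rms, "std": std,
            "skewness": skew, "kurtosis": kurt}

print(signal_stats([0.1, -0.2, 0.4, -0.1, 0.3, -0.5]))
```

Kurtosis here is the non-excess ("raw") form; subtracting 3 gives the excess kurtosis that some tools report, so conventions should be checked when comparing values across software.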
The third stage prepares the computation of the atmospheric turbulence parameters. The system creates new arrays of data to process and calculates the second-order statistical moments that are important for solving problems of atmospheric surface layer physics, predicting pollutant dispersion in the atmosphere, etc. The calculation results are visualized and stored on a hard disk.

12. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

NARCIS (Netherlands)

Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of…

13. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

NARCIS (Netherlands)

Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

1995-01-01

In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of t…

14. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

NARCIS (Netherlands)

W.H.F. Goessens (Wil); J.A.J.W. Kluytmans (Jan); N. den Toom; T.H. van Rijsoort-Vos; E. Stolz (Ernst); H.A. Verbrugh (Henri); W.G.V. Quint (Wim); H.G.M.
Niesters (Bert)

1995-01-01

In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least…

15. Image Data Processing System (IDAPS) user manual, S-056 experiment. Volume 1: System description. Volume 2: Batch IDAPS. Volume 3: Interactive IDAPS

Science.gov (United States)

1975-01-01

The image data processing system (IDAPS) developed to satisfy the image processing requirements of the Skylab S-056 experiment is described. The S-056 experiment was designed to obtain high-resolution photographs of the sun in the far ultraviolet, or soft X-ray, portion of the electromagnetic spectrum. Thirty-five thousand photographs were obtained by the three flights of the program; and, faced with such a massive volume of imagery, the designers of the experiment decided to develop a computer-based system which would reduce the image processing workload. The purpose of the IDAPS User Manual is to give the IDAPS user the necessary information and instructions to effectively utilize the system.

16. In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units

Energy Technology Data Exchange (ETDEWEB)

Ranjan, Niloo [ORNL]; Sanyal, Jibonananda [ORNL]; New, Joshua Ryan [ORNL]

2013-08-01

Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, which is part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data.
The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data, covering monthly, daily, hourly, and 15-minute time steps for each annual simulation, are produced. A total of 270 TB+ of data has been produced. In this project, the simulation data are statistically analyzed in situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation, leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data, which reduces by multiple orders of magnitude the time and amount of data that need to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.

17. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

Science.gov (United States)

Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

2016-08-01

Blast furnace operators expect to get sinter with homogeneous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks.
Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, both to save money and to recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. Statistical tools were used systematically to analyze historical data, including linear and partial correlations applied to the data, and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

18. A statistical RCL interconnect delay model taking account of process variations

Institute of Scientific and Technical Information of China (English)

Zhu Zhang-Ming; Wan Da-Jing; Yang Yin-Tang; En Yun-Fei

2011-01-01

As the feature size of the CMOS integrated circuit continues to shrink, process variations have become a key factor affecting interconnect performance. Based on the equivalent Elmore model and the use of polynomial chaos theory and the Galerkin method, we propose a linear statistical RCL interconnect delay model that takes process variations into account through successive application of the linear approximation method. Based on a variety of nano-CMOS process parameters, HSPICE simulation results show that the maximum error of the proposed model is less than 3.5%. The proposed model is simple, of high precision, and can be used in the analysis and design of nanometer integrated circuit interconnect systems.

19.
Feasibility study of using statistical process control to customized quality assurance in proton therapy

Energy Technology Data Exchange (ETDEWEB)

Rah, Jeong-Eun; Oh, Do Hoon [Department of Radiation Oncology, Myongji Hospital, Goyang 412-270 (Korea, Republic of)]; Shin, Dongho; Kim, Tae Hyun [Proton Therapy Center, National Cancer Center, Goyang 410-769 (Korea, Republic of)]; Kim, Gwe-Ya, E-mail: gweyakim@gmail.com [Department of Radiation Medicine and Applied Sciences, University of California, San Diego, California 92093 (United States)]

2014-09-15

Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

20. Multivariate statistical process control in product quality review assessment - A case study.
Science.gov (United States)

Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

2017-08-07

According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against the quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining only the effect of a single factor at a time using a Shewhart chart. It neglects to take into account the interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA), with the data either autoscaled or robust scaled, showed four and seven batches, respectively, outside the Hotelling T(2) 95% ellipse. Also, an improvement in the capability of the process is observed without the most extreme batches. The MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment.

1.
A statistical approach for identifying nuclear waste glass compositions that will meet quality and processability requirements

Energy Technology Data Exchange (ETDEWEB)

Piepel, G.F.

1990-09-01

Borosilicate glass provides a solid, stable medium for the disposal of high-level radioactive wastes resulting from the production of nuclear materials for United States defense needs. The glass must satisfy various quality and processability requirements on properties such as chemical durability, viscosity, and electrical conductivity. These properties depend on the composition of the waste glass, which will vary during production due to variations in nuclear waste composition and variations in the glass-making process. This paper discusses the experimentally based statistical approach being used in the Hanford Waste Vitrification Plant (HWVP) Composition Variability Study (CVS). The overall goal of the CVS is to identify the composition region of potential HWVP waste glasses that satisfy, with high confidence, the applicable quality and processability requirements. This is being accomplished by melting and obtaining property data for simulated nuclear waste glasses of various compositions, and then statistically developing models and other tools needed to meet the goal. 6 refs., 1 fig., 5 tabs.

2. Design of U-Geometry Parameters Using Statistical Analysis Techniques in the U-Bending Process

Directory of Open Access Journals (Sweden)

Wiriyakorn Phanitwong

2017-06-01

The various U-geometry parameters in the U-bending process result in difficulties in controlling the spring-back characteristic. In this study, the effects of U-geometry parameters, including channel width, bend angle, material thickness, tool radius, and workpiece length, and their design were investigated using a combination of finite element method (FEM) simulation and statistical analysis techniques.
Based on stress distribution analyses, the FEM simulation results clearly identified the different bending mechanisms and the effects of U-geometry parameters on the spring-back characteristic in the U-bending process, with and without pressure pads. The statistical analyses showed that the bend angle and channel width have a major influence in the cases with and without pressure pads, respectively. Experiments were carried out to validate the FEM simulation results. The FEM simulation results were in agreement with the experimental results in terms of the bending forces and bending angles.

3. Impact analysis of critical success factors on the benefits from statistical process control implementation

Directory of Open Access Journals (Sweden)

Fabiano Rodrigues Soriano

Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring, and analyzing causes of variation in quality characteristics and/or in the parameters used for control and process improvement. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors, such as lack of top-management commitment and little understanding of its potential benefits. Other aspects concern technical factors, such as lack of training on, and understanding of, the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top-management commitment (support), SPC training, and application, as well as to understand the relationships between these factors and the benefits associated with implementing the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish the causal relations.
A cross-sectional survey was used as the research method to collect information from a sample of Brazilian auto-parts companies, which were selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone and invited to participate in the survey; however, just 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs. In turn, these trainings affect the way companies apply the techniques, and this is reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected to each other and are represented by the relation between top-management support and training. The technical aspects observed through SPC…

4. Treatment of the background error in the statistical analysis of Poisson processes

Science.gov (United States)

Giunti, C.

1999-06-01

The formalism that allows one to take into account the error σb of the expected mean background b¯ in the statistical analysis of a Poisson process with the frequentist method is presented. It is shown that the error σb cannot be neglected if it is not much smaller than b¯. The resulting confidence belt is larger than the one for σb=0, leading to larger confidence intervals for the mean μ of signal events.

5. Using statistical process control methodology to improve the safe operating envelope

Energy Technology Data Exchange (ETDEWEB)

Reeves, A.D.; Lunney, B.P.; McIntyre, C.M. [Atlantic Nuclear Services Ltd. (ANSL), Fredericton, New Brunswick (Canada)]; Prime, D.R. [New Brunswick Power Nuclear (NBPN), Lepreau, New Brunswick (Canada)]

2009-07-01

Failure limits used to assess impairments from Operating Manual Tests (OMT) are often established using licensing limits from safety analysis.
While these ensure that licensing conditions are not violated, they do not provide proactive indications of problems developing with system components. This paper discusses statistical process control (SPC) methods to define action limits useful in diagnosing system component problems before impairment limits are reached. Using data from a specific OMT, an example of one such application is provided. Application of SPC limits can improve station operating economics through early detection of abnormal equipment behaviour. (author)

6. PMMA/PS coaxial electrospinning: a statistical analysis on processing parameters

Science.gov (United States)

Rahmani, Shahrzad; Arefazar, Ahmad; Latifi, Masoud

2017-08-01

Coaxial electrospinning, as a versatile method for producing core-shell fibers, is known to be very sensitive to two classes of influential factors: material and processing parameters. Although coaxial electrospinning has been the focus of many studies, the effects of processing parameters on the outcomes of this method have not yet been well investigated. A good knowledge of the impacts of processing parameters and their interactions on coaxial electrospinning can make it possible to better control and optimize this process. Hence, in this study, the statistical technique of the response surface method (RSM), using a design of experiments on four processing factors (voltage, distance, core and shell flow rates), was applied. Transmission electron microscopy (TEM), scanning electron microscopy (SEM), oil immersion and fluorescence microscopy were used to characterize fiber morphology. The core and shell diameters of the fibers were measured, and the effects of all factors and their interactions were discussed. Two polynomial models with acceptable R-squared values were proposed to describe the core and shell diameters as functions of the processing parameters.
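Polynomial response-surface models of the kind just mentioned are typically fitted by ordinary least squares. The sketch below assumes a single-factor quadratic (fiber diameter as a function of voltage) with invented data, not the study's four-factor models:

```python
# Hedged sketch: least-squares fit of y = b0 + b1*x + b2*x^2 via the
# normal equations, solved with a small Gauss-Jordan elimination.
# The voltage/diameter numbers below are made up for illustration.

def solve3(a, b):
    """Solve a 3x3 linear system a*x = b by Gauss-Jordan elimination."""
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))  # partial pivot
        m[i], m[p] = m[p], m[i]
        for r in range(3):
            if r != i:
                f = m[r][i] / m[i][i]
                m[r] = [mv - f * iv for mv, iv in zip(m[r], m[i])]
    return [m[i][3] / m[i][i] for i in range(3)]

def fit_quadratic(xs, ys):
    """Least-squares coefficients (b0, b1, b2) through (xs, ys)."""
    cols = [[1.0, x, x * x] for x in xs]
    ata = [[sum(c[i] * c[j] for c in cols) for j in range(3)]
           for i in range(3)]
    atb = [sum(c[i] * y for c, y in zip(cols, ys)) for i in range(3)]
    return solve3(ata, atb)

# Illustrative voltages (kV) and shell diameters (nm); invented numbers.
volts = [10.0, 12.0, 14.0, 16.0, 18.0]
diams = [950.0, 870.0, 820.0, 800.0, 810.0]
b0, b1, b2 = fit_quadratic(volts, diams)
print("diameter ~ %.1f + %.1f*V + %.2f*V^2" % (b0, b1, b2))
```

A full RSM study would use a coded multi-factor design with interaction terms and assess the fit through R-squared and lack-of-fit tests; the normal-equations mechanics are the same, just with more columns.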
Voltage and distance were recognized as the most significant and influential factors for shell diameter, while core diameter was mainly under the influence of the core and shell flow rates besides the voltage. 7. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models Science.gov (United States) García-Díaz, J. Carlos 2009-11-01 Fault detection and diagnosis is an important problem in process engineering. Process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing, and the increasingly stringent quality requirements in the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures and bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets: a training set of 25 conforming coils and a second set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical. In most applications, the dependent variable is binary. The results show that the logistic generalized linear models do provide good estimates of coil quality and can be useful for quality control in the manufacturing process. 8.
A bibliometric analysis of 50 years of worldwide research on statistical process control Directory of Open Access Journals (Sweden) Fabiane Letícia Lizarelli Full Text Available Abstract An increasing number of papers on statistical process control (SPC) has emerged in the last fifty years, especially in the last fifteen years. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC has a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. There are several other statistical techniques and tools for supporting continuous improvement and innovation of products and processes but, despite the limitations in their use in improvement projects, there is growing concern about the use of SPC. A gap between the SPC techniques taught in engineering courses and their practical applications to industrial problems is observed in empirical research; thus, it is important to understand what has been done and identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates that there was a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers and their networks. The main sources of publications are also identified; it is observed that SPC papers are concentrated in a few of the international research journals, not necessarily those with the highest impact factors. Furthermore, the papers are focused on the industrial engineering, operations research and management science fields.
The most common term found in the papers was cumulative sum control charts, but new topics have emerged and have been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods. 9. Frontiers in statistical quality control CERN Document Server Wilrich, Peter-Theodor 2004-01-01 This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and fields related to Reliability. Experts with international reputations present their newest contributions. 10. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR. OpenAIRE Goessens, Wil; Kluytmans, Jan; Toom, N.; van Rijsoort-Vos, T H; Stolz, Ernst; Verbrugh, Henri; Quint, Wim; Niesters, Bert 1995-01-01 In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of the clinical sample volume per PCR. By using this preparation procedure, 1,110 clinical samples were evaluated by PCR and by cell culture, and results were compared. After discordant ana... 11.
Statistical Behavior of Formation Process of Magnetic Vortex State in Ni80Fe20 Nanodisks Energy Technology Data Exchange (ETDEWEB) Im, Mi-Young; Fischer, Peter; Keisuke, Yamada; Kasai, Shinya 2011-01-14 Magnetic vortices in magnetic nanodots, which are characterized by in-plane (chirality) and out-of-plane (polarity) magnetization components, have attracted intense interest because of their high potential for technological applications in data storage and memory schemes, as well as the scientific interest in understanding fundamental physics in magnetic nanostructures. A complete understanding of the formation process of the vortex state in magnetic vortex systems is a very significant issue both for achieving storage and memory technologies using magnetic vortices and for understanding the intrinsic physical properties of magnetic nanostructures. In our work, we have statistically investigated the formation process of the vortex state in permalloy (Py, Ni{sub 80}Fe{sub 20}) nanodisks through direct observation of the vortex structure utilizing magnetic transmission soft X-ray microscopy (MTXM) with a high spatial resolution down to 20 nm. Magnetic imaging in Py nanodots was performed at the Fe L{sub 3} (707 eV) absorption edge. Figure 1 shows in-plane and out-of-plane magnetic components observed in 40 nm thick nanodot arrays with dot radii of r = 500 and 400 nm, respectively. Vortex chirality, either clockwise (CW) or counter-clockwise (CCW), and polarity, either up or down, are clearly visible in both arrays. To investigate the statistical behavior of the formation process of the vortex state, observation of the vortex structure at a remanent state after saturation of the nanodots by an external magnetic field of 1 kOe was repeated over 100 times for each array. The typical MTXM images of vortex chirality taken in two successive measurements, together with their overlapped images, in nanodot arrays of r = 500 and 400 nm are displayed in Fig. 2.
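The repeated-nucleation statistics described above amount to estimating a binomial proportion for each nanodot. A minimal sketch with hypothetical CW/CCW outcomes and a normal-approximation 95% interval:

```python
import math
from collections import Counter

def chirality_stats(observations):
    """Estimate P(CW) and a normal-approximation 95% confidence interval
    from repeated vortex-formation observations on one nanodot."""
    counts = Counter(observations)
    n = len(observations)
    p = counts["CW"] / n
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical outcomes of repeated saturate-and-relax cycles (illustrative)
obs = ["CW", "CCW", "CW", "CW", "CCW", "CW", "CCW", "CCW", "CW", "CW"]
p, lo, hi = chirality_stats(obs)
```

With the 100+ repetitions per array reported above, the interval narrows enough to distinguish a genuinely stochastic 50/50 process from a biased one.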
Within the statistical measurement, the formation process of chirality of either CW or CCW is quite stochastic in each nanodot. Similar behavior is also witnessed in the formation of vortex polarity observed in consecutive 12. A STATISTICAL REVIEW OF DWPF LABORATORY MEASUREMENTS GENERATED DURING THE PROCESSING OF BATCHES 300 THROUGH 356 Energy Technology Data Exchange (ETDEWEB) Edwards, T 2006-08-31 In this report, the Statistical Consulting Section (SCS) of the Savannah River National Laboratory (SRNL) provides summaries and comparisons of composition measurements for glass samples that were generated during the processing of batches 300 through 356 at the Defense Waste Processing Facility (DWPF). These analyses, which include measurements of samples from the Sludge Receipt and Adjustment Tank (SRAT) and the Slurry Mix Evaporator (SME) as well as samples of glass standards, were provided to SCS by the DWPF Laboratory (DWPF Lab) of Waste Laboratory Services. The comparisons made by SCS were extensive given that these data allowed for contrasts between preparation methods and between the two spectrometers that are currently in use at the DWPF Lab. In addition to general comparisons, specific questions that were posed in the Technical Task Request (TTR) behind this effort were addressed in this report. 13. Statistical modeling of copper losses in the silicate slag of the sulfide concentrate smelting process Directory of Open Access Journals (Sweden) Savic Marija V. 2015-09-01 Full Text Available This article presents the results of the statistical modeling of copper losses in the silicate slag of the sulfide concentrates smelting process. The aim of this study was to define the correlation dependence of the degree of copper losses in the silicate slag on the following parameters of technological processes: SiO2, FeO, Fe3O4, CaO and Al2O3 content in the slag and copper content in the matte. 
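Fitting such a correlation dependence of copper losses on slag composition can be sketched with ordinary least squares via the normal equations. The compositions and losses below are made up for illustration, and only two of the paper's predictors are used.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting, for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def mlr_coefficients(X, y):
    """Least-squares coefficients for y ~ 1 + X via the normal equations."""
    rows = [[1.0] + list(row) for row in X]
    p = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    return solve_linear(XtX, Xty)

# Hypothetical slag data: (SiO2 %, Fe3O4 %) -> copper loss in slag (%)
X = [(32, 8), (34, 10), (30, 7), (36, 12), (33, 9)]
y = [0.62, 0.75, 0.55, 0.88, 0.68]
b0, b_sio2, b_fe3o4 = mlr_coefficients(X, y)
```

The ANN and ANFIS models compared in the record can capture nonlinear dependence that this linear form cannot, which is consistent with ANFIS giving the best R2 there.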
Multiple linear regression analysis (MLRA), artificial neural networks (ANNs) and an adaptive network-based fuzzy inference system (ANFIS) were used as tools for mathematical analysis of the indicated problem. The best correlation coefficient (R2 = 0.719) of the final model was obtained using the ANFIS modeling approach. 14. Statistical mechanics of neocortical interactions: Large-scale EEG influences on molecular processes. Science.gov (United States) Ingber, Lester 2016-04-21 Calculations further support the premise that large-scale synchronous firings of neurons may affect molecular processes. The context is scalp electroencephalography (EEG) during short-term memory (STM) tasks. The mechanism considered is Π=p+qA (SI units) coupling, where p is the momenta of free Ca(2+) waves, q the charge of Ca(2+) in units of the electron charge, and A the magnetic vector potential of current I from neuronal minicolumnar firings considered as wires, giving rise to EEG. Data were processed using multiple graphs to identify sections of data to which spline-Laplacian transformations are applied, to fit the statistical mechanics of neocortical interactions (SMNI) model to EEG data, sensitive to synaptic interactions subject to modification by Ca(2+) waves. 15. EXPERIMENTALLY-STATISTICAL MODEL OF CLADDING LAYER FORMATION PROCESS ON SLIDE-WAYS Directory of Open Access Journals (Sweden) N. N.
Maksimchenko 2010-01-01 Full Text Available The developed experimentally-statistical model of the cladding composite layer formation process on slide-ways makes it possible to control the technological modes of cladding by flexible instrument (CFI) in order to obtain the desired coating properties (thickness, continuity, adhesion strength). The established optimum technological modes of the CFI process, which provide the formation of continuous composite coatings of the required thickness that adhere strongly to the base, have been used for applying coatings to the working surfaces of slide-ways of metal-cutting machine tool beds. This lowered the friction factor in the coupling by 1.3–1.7-fold on average and improved the uniformity of slow movement of machine tool units by 1.74-fold in comparison with slide-ways without a coating. 16. Stepping and crowding of molecular motors: statistical kinetics from an exclusion process perspective. Science.gov (United States) Ciandrini, Luca; Romano, M Carmen; Parmeggiani, Andrea 2014-09-02 Motor enzymes are remarkable molecular machines that use the energy derived from the hydrolysis of a nucleoside triphosphate to generate mechanical movement, achieved through different steps that constitute their kinetic cycle. These macromolecules, nowadays investigated with advanced experimental techniques to unveil their molecular mechanisms and the properties of their kinetic cycles, are implicated in many biological processes, ranging from biopolymerization (e.g., RNA polymerases and ribosomes) to intracellular transport (motor proteins such as kinesins or dyneins). Although the kinetics of individual motors is well studied on both theoretical and experimental grounds, the repercussions of their stepping cycle on the collective dynamics still remain unclear. Advances in this direction will improve our comprehension of transport processes in the natural intracellular medium, where processive motor enzymes might operate in crowded conditions.
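Crowded traffic of processive motors, as discussed in the record above, is commonly idealized as a totally asymmetric simple exclusion process (TASEP). A minimal open-boundary, random-sequential simulation sketch with illustrative rates:

```python
import random

def simulate_tasep(length, alpha, beta, steps, seed=0):
    """Open-boundary totally asymmetric simple exclusion process (TASEP).

    alpha: entry rate at the left boundary; beta: exit rate at the right.
    Particles hop one site to the right only if the target site is empty.
    Returns the time-averaged density profile over the last half of the run.
    """
    rng = random.Random(seed)
    lattice = [0] * length
    density = [0.0] * length
    samples = 0
    for step in range(steps):
        for _ in range(length + 1):          # random-sequential update
            i = rng.randrange(-1, length)
            if i == -1:                      # injection at the left boundary
                if lattice[0] == 0 and rng.random() < alpha:
                    lattice[0] = 1
            elif i == length - 1:            # extraction at the right boundary
                if lattice[i] == 1 and rng.random() < beta:
                    lattice[i] = 0
            elif lattice[i] == 1 and lattice[i + 1] == 0:
                lattice[i], lattice[i + 1] = 0, 1
        if step >= steps // 2:               # sample after burn-in
            samples += 1
            for i, occ in enumerate(lattice):
                density[i] += occ
    return [d / samples for d in density]

profile = simulate_tasep(length=50, alpha=0.3, beta=0.3, steps=2000)
```

This bare sketch omits the motors' internal stepping cycle that the paper adds; extending each particle with a multi-state kinetic cycle is what links single-motor randomness to the collective clustering discussed there.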
In this work, we therefore extend contemporary statistical kinetic analysis to study collective transport phenomena of motors in terms of lattice gas models belonging to the exclusion process class. Via numerical simulations, we show how to interpret and use the randomness calculated from single-particle trajectories in crowded conditions. Importantly, we also show that time fluctuations and non-Poissonian behavior are intrinsically related to spatial correlations and the emergence of large, but finite, clusters of comoving motors. The properties unveiled by our analysis have important biological implications for the collective transport characteristics of processive motor enzymes in crowded conditions. 17. Selected Remarks about Computer Processing in Terms of Flow Control and Statistical Mechanics Directory of Open Access Journals (Sweden) Dominik Strzałka 2016-03-01 Full Text Available Despite the fact that much has been said about processing in computer science, it seems that there is still much to do. A classical approach assumes that the computations done by computers are a kind of mathematical operation (calculations of function values) and have no special relation to energy transformation and flow. However, there is a possibility to get a new view on selected topics, and as a special case, the sorting problem is presented; we know many different sorting algorithms, including those that have complexity equal to O(n lg(n)), which means that this problem is algorithmically closed, but it is also possible to focus on the problem of sorting in terms of flow control, entropy and statistical mechanics. This is done in relation to the existing definitions of sorting, connections between sorting and ordering, and some important aspects of computer processing understood as a flow that are not taken into account in many theoretical considerations in computer science.
The proposed new view is an attempt to change the paradigm in the description of algorithms’ performance by computational complexity and processing, taking into account the existing relations between the idea of Turing machines and their physical implementations. This proposal can be expressed as a physics of computer processing: a reference point for further analysis of algorithmic and interactive processing in computer systems. 18. IMPROVING KNITTED FABRICS BY A STATISTICAL CONTROL OF DIMENSIONAL CHANGES AFTER THE DYEING PROCESS Directory of Open Access Journals (Sweden) LLINARES-BERENGUER Jorge 2017-05-01 Full Text Available One of the most important problems that cotton knitted fabrics present during the manufacturing process is their dimensional instability, which needs to be minimised. Some of the variables that intervene in fabric shrinkage are related to its structural characteristics, the fiber used when producing the yarn, the yarn count used, or the dyeing process employed. Conducted under real factory conditions, the present study attempted to model the behaviour of a fabric structure after a dyeing process by contributing several algorithms that calculate dyed fabric stability after the first wash cycle. Small-diameter circular machines are used to produce garments with no side seams. This is the reason why a list of machines that produce the same fabrics at different widths needs to be made available to produce all the sizes of a given garment. Two relaxation states were distinguished for interlock fabric: dyed and dry relaxation, and dyed and wash relaxation. The linear density of the yarn employed to produce the sample fabric was combed cotton Ne 30. The machines used for optic bleaching were Overflow machines. To obtain knitting structures with optimum dimensional stability, different statistical tools were used to help evaluate all the production process variables (raw material, machines and process) responsible for this variation.
This made it possible to guarantee product quality without incurring costs and losses. 19. Using multitype branching processes to quantify statistics of disease outbreaks in zoonotic epidemics Science.gov (United States) Singh, Sarabjeet; Schneider, David J.; Myers, Christopher R. 2014-03-01 Branching processes have served as a model for chemical reactions, biological growth processes, and contagion (of disease, information, or fads). Through this connection, these seemingly different physical processes share some common universalities that can be elucidated by analyzing the underlying branching process. In this work we focus on coupled branching processes as a model of infectious diseases spreading from one population to another. An exceedingly important example of such coupled outbreaks is zoonotic infections that spill over from animal populations to humans. We derive several statistical quantities characterizing the first spillover event from animals to humans, including the probability of spillover, the first passage time distribution for human infection, and disease prevalence in the animal population at spillover. Large stochastic fluctuations in those quantities can make inference of the state of the system at the time of spillover difficult. Focusing on outbreaks in the human population, we then characterize the critical threshold for a large outbreak, the distribution of outbreak sizes, and associated scaling laws. These all show a strong dependence on the basic reproduction number in the animal population and indicate the existence of a novel multicritical point with altered scaling behavior. The coupling of animal and human infection dynamics has crucial implications, most importantly allowing for the possibility of large human outbreaks even when human-to-human transmission is subcritical. 20. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.
Directory of Open Access Journals (Sweden) Emilio Mezzenga Full Text Available The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation at the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system. 1. High Accuracy Extraction of Respiratory Sinus Arrhythmia with Statistical Processing using Normal Distribution Science.gov (United States) Numata, Takashi; Ogawa, Yutaro; Yoshida, Lui; Kotani, Kiyoshi; Jimbo, Yasuhiko The autonomic nervous system is important in maintaining homeostasis by mediating the opposing effects of sympathetic and parasympathetic nervous activity on organs. Although it is known that the amplitude of RSA (Respiratory Sinus Arrhythmia) is an index of parasympathetic nervous activity, it is difficult to estimate that activity in real time in everyday situations. This is partly because of body motions and extrasystoles. In addition, although automatic recognition of the R-wave on electrocardiograms is required for real-time analysis of RSA amplitude, there is an unresolved problem of false recognition of the R-wave.
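The exponentially weighted moving average charts mentioned in the tomotherapy record above can be sketched as follows; the target, sigma, smoothing constant and daily readings are hypothetical, using the textbook time-varying limit formula.

```python
import math

def ewma_series(values, lam=0.2, start=None):
    """EWMA statistic z_t = lam * x_t + (1 - lam) * z_{t-1}."""
    z = values[0] if start is None else start
    out = []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

def ewma_limits(target, sigma, lam=0.2, L=3.0, n_points=10):
    """Time-varying EWMA control limits around the target value."""
    lims = []
    for t in range(1, n_points + 1):
        w = sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        lims.append((target - L * w, target + L * w))
    return lims

# Hypothetical daily output readings (% of nominal)
outputs = [100.2, 99.8, 100.5, 99.9, 100.1, 100.4, 99.7, 100.0, 100.3, 99.9]
z = ewma_series(outputs, start=100.0)
limits = ewma_limits(target=100.0, sigma=0.3)
in_control = all(lo <= zi <= hi for zi, (lo, hi) in zip(z, limits))
```

Because the EWMA pools recent history, it reacts to small sustained output drifts sooner than an individuals chart with the same data.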
In this paper, we propose a method to evaluate the amplitude of RSA accurately using statistical processing with probabilistic models. Then, we estimate parasympathetic nervous activity during body motion and isometric exercise to examine the validity of the method. As a result, using the proposed method, we demonstrate that the amplitude of RSA can be extracted even in the presence of false recognition of the R-wave. In addition, an appropriate threshold for the estimate is one or five percent, because with the threshold at ten percent the waveforms of RSA amplitude do not follow the abrupt changes of parasympathetic nervous activity evoked by isometric exercise. Furthermore, the method using the normal distribution is found to be more appropriate than that using the chi-square distribution for statistical processing. Therefore, we expect that the proposed method can evaluate parasympathetic nervous activity with high accuracy in everyday situations. 2. Using statistical process control to make data-based clinical decisions. Science.gov (United States) Pfadt, A; Wheeler, D J 1995-01-01 Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior and allows the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction.
SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help it achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered. 3. One approach in using multivariate statistical process control in analyzing cheese quality Directory of Open Access Journals (Sweden) Ilija Djekic 2015-05-01 Full Text Available The objective of this paper was to investigate the possibility of using multivariate statistical process control in analysing cheese quality parameters. Two cheese types (white brined cheeses and soft cheese from ultra-filtered milk) were selected and analysed for several quality parameters, such as dry matter, milk fat, protein content, pH, NaCl, fat in dry matter and moisture in non-fat solids. The obtained results showed significant variations for most of the quality characteristics examined between the two types of cheese. The only stable parameter in both types of cheese was moisture in non-fat solids. All of the other cheese quality characteristics fell above or below the control limits for most of the samples. Such results indicated high instability and variation within cheese production. Although the use of statistical process control is not mandatory in the dairy industry, it might provide benefits to organizations in improving quality control of dairy products. 4.
Multicomponent statistical analysis to identify flow and transport processes in a highly-complex environment Science.gov (United States) Moeck, Christian; Radny, Dirk; Borer, Paul; Rothardt, Judith; Auckenthaler, Adrian; Berg, Michael; Schirmer, Mario 2016-11-01 A combined approach of multivariate statistical analysis, namely factor analysis (FA) and hierarchical cluster analysis (HCA), interpretation of geochemical processes, stable water isotope data and organic micropollutants, enabling assessment of spatial patterns of water types, was applied to a study area in Switzerland, where drinking water production is close to different potential input pathways for contamination. To avoid drinking water contamination, artificial groundwater recharge with surface water into an aquifer is used to create a hydraulic barrier between potential intake pathways for contamination and drinking water extraction wells. Inter-aquifer mixing in the subsurface is identified, where a high amount of artificially infiltrated surface water is mixed with a lesser amount of water originating from the regional flow pathway in the vicinity of drinking water extraction wells. The spatial distribution of the different water types can be estimated and a conceptual system understanding is developed. Results of the multivariate statistical analysis are comparable with the information gained from the isotopic data and organic micropollutant analyses. The integrated approach using different kinds of observations can be easily transferred to a variety of hydrological settings to synthesise and evaluate large hydrochemical datasets. The combination of additional data with different information content is conceivable and enables effective interpretation of hydrological processes. Using the applied approach leads to a more sound conceptual system understanding, which acts as the very basis for developing improved water resources management practices in a sustainable way. 5.
Statistical Review of Data from DWPF's Process Samples for Batches 19 Through 30 Energy Technology Data Exchange (ETDEWEB) Edwards, T.B. 1999-04-06 The measurements derived from samples taken during the processing of batches 19 through 30 at the Defense Waste Processing Facility (DWPF) afford an opportunity for review and comparisons. This report has looked at some of the statistics from these data. Only the data reported by the DWPF lab (that is, the data provided by the lab as representative of the samples taken) are available for this analysis. In some cases, the sample results reported may be a subset of the sample results generated by the analytical procedures. A thorough assessment of the DWPF lab's analytical procedures would require the complete set of data. Thus, the statistics reported here, specifically as they relate to analytical uncertainties, are limited to the reported data for these samples. A feel for the consistency of the incoming slurry is given by the estimation of the components of variation for the Sludge Receipt and Adjustment Tank (SRAT) receipts. In general, for all of the vessels, the data from batches after 21 show smaller batch-to-batch variation than the data from all the batches. The relative contributions of batch-to-batch versus residual variation, which includes analytical variation, are presented in these analyses. 6. Development of the NRC's Human Performance Investigation Process (HPIP). Volume 3, Development documentation Energy Technology Data Exchange (ETDEWEB) Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States) 1993-10-01 The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance-related events at nuclear power plants.
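The batch-to-batch versus residual (analytical) split described in the DWPF review above is a one-way variance-components estimate. A method-of-moments sketch with made-up duplicate measurements per batch:

```python
def variance_components(groups):
    """Method-of-moments estimates for a one-way random-effects model.

    groups: equal-sized lists of replicate measurements, one list per batch.
    Returns (batch_to_batch_variance, residual_variance), truncating the
    between-batch component at zero as is conventional.
    """
    k = len(groups)
    n = len(groups[0])
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    ms_within = ss_within / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    var_batch = max(0.0, (ms_between - ms_within) / n)
    return var_batch, ms_within

# Hypothetical duplicate measurements for four incoming batches
batches = [[10.2, 10.4], [11.0, 10.8], [9.6, 9.8], [10.5, 10.7]]
vb, vw = variance_components(batches)
```

A batch component well above the residual component, as in this toy data, indicates real batch-to-batch variation rather than analytical scatter.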
The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provides a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume III, is a detailed documentation of the development effort and the pilot training program. 7. Boletin Estadistico de la Educacion: Ano VI, No. 1 (Statistical Bulletin on Education: Volume VI, No. 1). Science.gov (United States) Ministerio de Educacion, Guatemala City (Guatemala). Oficina de Planeamiento Integral de la Educacion. This booklet presents statistics concerning primary education in Guatemala. The first section covers enrollment, considering such factors as type of school and location. Other sections provide statistics on teachers, their locations, the number of schools, enrollment in terms of students repeating grades or leaving school, students advancing out… 8. Preliminary evaluation of alternative waste form solidification processes. Volume I. Identification of the processes. Energy Technology Data Exchange (ETDEWEB) Treat, R.L.; Nesbitt, J.F.; Blair, H.T.; Carter, J.G.; Gorton, P.S.; Partain, W.L.; Timmerman, C.L. 1980-04-01 This document contains preconceptual design data on 11 processes for the solidification and isolation of nuclear high-level liquid wastes (HLLW). 
The processes are: in-can glass melting (ICGM) process, joule-heated glass melting (JHGM) process, glass-ceramic (GC) process, marbles-in-lead (MIL) matrix process, supercalcine pellets-in-metal (SCPIM) matrix process, pyrolytic-carbon coated pellets-in-metal (PCCPIM) matrix process, supercalcine hot-isostatic-pressing (SCHIP) process, SYNROC hot-isostatic-pressing (SYNROC HIP) process, titanate process, concrete process, and cermet process. For the purposes of this study, it was assumed that each of the solidification processes is capable of handling similar amounts of HLLW generated in a production-sized fuel reprocessing plant. It was also assumed that each of the processes would be enclosed in a shielded canyon or cells within a waste facility located at the fuel reprocessing plant. Finally, it was assumed that all of the processes would be subject to the same set of regulations, codes and standards. Each of the solidification processes converts waste into forms that may be acceptable for geological disposal. Each process begins with the receipt of HLLW from the fuel reprocessing plant. In this study, it was assumed that the original composition of the HLLW would be the same for each process. The process ends when the different waste forms are enclosed in canisters or containers that are acceptable for interim storage. Overviews of each of the 11 processes and the bases used for their identification are presented in the first part of this report. Each process, including its equipment and its requirements, is covered in more detail in Appendices A through K. Pertinent information on the current state of the art and the research and development required for the implementation of each process are also noted in the appendices. 9. Statistical process control analysis for patient-specific IMRT and VMAT QA. 
Science.gov (United States) Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd 2013-05-01 This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-megavolt beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than in VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than in VMAT QA because the VMAT plans have more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value was 1.60 for IMRT QA and 1.99 for VMAT QA, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because the Cpml values are higher than 1.0. 10. QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY Directory of Open Access Journals (Sweden) Yonatan Mengesha Awaj 2013-03-01 Full Text Available In order to survive in a competitive market, improving the quality and productivity of products or processes is a must for any company.
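A one-sided capability calculation of the kind used with a lower % gamma pass limit can be sketched as follows. The values and the 90% action level below are illustrative, and the simple Cpl shown is a plainer relative of the Cpml index used in the QA record above.

```python
import statistics

def lower_capability_index(values, lower_spec):
    """Cpl = (mean - LSL) / (3 s): capability against a one-sided lower spec.

    Values above 1.0 indicate the process mean sits more than three
    standard deviations above the lower specification limit.
    """
    m = statistics.fmean(values)
    s = statistics.stdev(values)
    return (m - lower_spec) / (3 * s)

# Hypothetical % gamma pass results against a 90% lower action level
gamma_pass = [96.5, 97.1, 95.8, 96.9, 97.4, 96.2, 95.9, 97.0]
cpl = lower_capability_index(gamma_pass, lower_spec=90.0)
```

Since % gamma pass has no meaningful upper specification, a one-sided index is the natural choice here.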
This study applies statistical process control (SPC) tools in the production processing line and on the final product in order to reduce defects by identifying where the highest waste occurs and to give suggestions for improvement. The approach used in this study comprised direct observation, thorough examination of the production process lines, brainstorming sessions and fishbone diagrams; information was collected from potential customers and the company's workers through interviews and questionnaires, and a Pareto chart/analysis and a control chart (p-chart) were constructed. It was found that the company has many problems; specifically, there is high rejection or waste in the production processing line. The highest waste occurs in the melting process line, which causes loss due to trickle, and in the forming process line, which causes loss due to defective product rejection. The vital few problems were identified: blisters, double seam, stone, pressure failure and overweight. The principal aim of the study is to make the quality team aware of how to use SPC tools in problem analysis, especially to train the quality team on how to hold an effective brainstorming session and how to exploit these data in cause-and-effect diagram construction, Pareto analysis and control chart construction. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company has many constraints to implementing all suggestions for improvement within a short period of time, the company recognized that the suggestions will provide significant productivity improvement in the long run. 11. Speech segmentation by statistical learning is supported by domain-general processes within working memory.
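The p-chart used in the glass-bottle record above is built from the fraction defective per inspected sample; a minimal sketch with invented inspection counts (the company's actual data are not given):

```python
import numpy as np

# Hypothetical inspection data: defectives per daily sample (not the company's data).
n = 500                                   # bottles inspected per sample
defectives = np.array([23, 31, 18, 27, 35, 22, 29, 25, 30, 21])
p = defectives / n
p_bar = p.mean()

# p-chart limits: p_bar +/- 3*sqrt(p_bar*(1 - p_bar)/n), with the LCL floored at 0.
se = np.sqrt(p_bar * (1 - p_bar) / n)
ucl, lcl = p_bar + 3 * se, max(p_bar - 3 * se, 0.0)
out_of_control = (p > ucl) | (p < lcl)
print(f"p_bar={p_bar:.3f}  limits=({lcl:.3f}, {ucl:.3f})  signals={out_of_control.sum()}")
```

Points outside the limits mark samples whose defect rate is unlikely to come from common-cause variation alone, which is where Pareto and fishbone analysis would be focused next.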
Science.gov (United States) Palmer, Shekeila D; Mattys, Sven L 2016-12-01 The purpose of this study was to examine the extent to which working memory resources are recruited during statistical learning (SL). Participants were asked to identify novel words in an artificial speech stream where the transitional probabilities between syllables provided the only segmentation cue. Experiments 1 and 2 demonstrated that segmentation performance improved when the speech rate was slowed down, suggesting that SL is supported by some form of active processing or maintenance mechanism that operates more effectively under slower presentation rates. In Experiment 3 we investigated the nature of this mechanism by asking participants to perform a two-back task while listening to the speech stream. Half of the participants performed a two-back rhyme task designed to engage phonological processing, whereas the other half performed a comparable two-back task on un-nameable visual shapes. It was hypothesized that if SL is dependent only upon domain-specific processes (i.e., phonological rehearsal), the rhyme task should impair speech segmentation performance more than the shape task. However, the two loads were equally disruptive to learning, as they both eradicated the benefit provided by the slow rate. These results suggest that SL is supported by working-memory processes that rely on domain-general resources. 12. Risk management for moisture related effects in dry manufacturing processes: a statistical approach. Science.gov (United States) Quiroz, Jorge; Strong, John; Zhang, Lanju 2016-03-01 A risk- and science-based approach to control the quality in pharmaceutical manufacturing includes a full understanding of how product attributes and process parameters relate to product performance through a proactive approach in formulation and process development. 
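The tolerance-interval approach that this dry-manufacturing record goes on to describe can be illustrated with the standard normal-theory construction (Howe's approximation for the two-sided factor, with a Wilson-Hilferty chi-square quantile). Note this is only a conventional baseline with invented lot data; the authors' own method is model-independent and avoids tabulated tolerance factors entirely:

```python
from statistics import NormalDist, mean, stdev

def chi2_quantile(p, df):
    # Wilson-Hilferty approximation to the chi-square quantile (stdlib-only sketch).
    z = NormalDist().inv_cdf(p)
    return df * (1 - 2 / (9 * df) + z * (2 / (9 * df)) ** 0.5) ** 3

def tolerance_interval(x, coverage=0.99, confidence=0.95):
    """Two-sided normal tolerance interval via Howe's approximation."""
    n = len(x)
    z = NormalDist().inv_cdf((1 + coverage) / 2)
    k = z * ((n - 1) * (1 + 1 / n) / chi2_quantile(1 - confidence, n - 1)) ** 0.5
    m, s = mean(x), stdev(x)
    return m - k * s, m + k * s

# Hypothetical moisture contents (% w/w) of historical raw-material lots.
lots = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 2.5, 2.2, 2.1, 2.4,
        2.0, 2.3, 2.2, 1.8, 2.6, 2.1, 2.2, 2.3, 2.0, 2.4]
lo, hi = tolerance_interval(lots)
print(f"99%/95% tolerance interval: ({lo:.2f}, {hi:.2f}) % w/w")
```

If the interval extends beyond the moisture range known to be acceptable for processability, the risk of an excursion is flagged before any blend is made.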
For dry manufacturing, where moisture content is not directly manipulated within the process, the variability in moisture of the incoming raw materials can impact both the processability and drug product quality attributes. A statistical approach is developed using individual raw material historical lots as a basis for the calculation of tolerance intervals for drug product moisture content so that risks associated with excursions in moisture content can be mitigated. The proposed method is based on a model-independent approach that uses available data to estimate parameters of interest that describe the population of blend moisture content values and which do not require knowledge of the individual blend moisture content values. Another advantage of the proposed tolerance intervals is that they do not require the use of tabulated values for tolerance factors. This facilitates their implementation in any spreadsheet program such as Microsoft Excel. A computational example is used to demonstrate the proposed method. 13. Electrophysiological Correlates of Emotional Content and Volume Level in Spoken Word Processing. Science.gov (United States) Grass, Annika; Bayer, Mareike; Schacht, Annekathrin 2016-01-01 For visual stimuli of emotional content such as pictures and written words, stimulus size has been shown to increase emotion effects in the early posterior negativity (EPN), a component of event-related potentials (ERPs) indexing attention allocation during visual sensory encoding. In the present study, we addressed the question of whether this enhanced relevance of larger (visual) stimuli might generalize to the auditory domain and whether auditory emotion effects are modulated by volume. Therefore, subjects listened to spoken words with emotional or neutral content, played at two different volume levels, while ERPs were recorded.
Negative emotional content led to an increased frontal positivity and parieto-occipital negativity (a scalp distribution similar to the EPN) between ~370 and 530 ms. Importantly, this emotion-related ERP component was not modulated by differences in volume level, which impacted early auditory processing, as reflected in increased amplitudes of the N1 (80-130 ms) and P2 (130-265 ms) components, as hypothesized. However, contrary to effects of stimulus size in the visual domain, volume level did not influence later ERP components. These findings indicate modality-specific and functionally independent processing triggered by the emotional content of spoken words and by volume level. 14. Artificial Intelligence vs. Statistical Modeling and Optimization of Continuous Bead Milling Process for Bacterial Cell Lysis. Science.gov (United States) Haque, Shafiul; Khan, Saif; Wahid, Mohd; Dar, Sajad A; Soni, Nipunjot; Mandal, Raju K; Singh, Vineeta; Tiwari, Dileep; Lohani, Mohtashim; Areeshi, Mohammed Y; Govender, Thavendran; Kruger, Hendrik G; Jawed, Arshad 2016-01-01 For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release are a major bottleneck. The recovery of a recombinant protein, cholesterol oxidase (COD), was studied in a continuous bead milling process. A full factorial response surface methodology (RSM) design was employed and compared to artificial neural networks coupled with genetic algorithms (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C), and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading OD600nm of 74, and run time of 29.9 min with a recovery of ~3.2 g/L. ANN-GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate (mL/h) of 258.08, bead loading (%, v/v) of 80%, cell loading (OD600nm) of 73.99, and run time of 32 min.
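The RSM side of the bead-milling record above rests on fitting a full second-order polynomial to the responses and solving for its stationary point; a minimal two-factor sketch on synthetic data (the study's four factors and measured recoveries are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical coded factors (e.g. feed rate, bead load) -> recovery; not the study's data.
X = rng.uniform([-1, -1], [1, 1], size=(30, 2))
y = (3.0 - 0.5 * X[:, 0] ** 2 - 0.8 * X[:, 1] ** 2
     + 0.2 * X[:, 0] * X[:, 1] + rng.normal(0, 0.02, 30))

# Full second-order (quadratic) response surface: the model class used in RSM.
x1, x2 = X[:, 0], X[:, 1]
D = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)

# Stationary point of the fitted quadratic: solve grad = 0.
b = beta[1:3]
B = np.array([[2 * beta[4], beta[3]], [beta[3], 2 * beta[5]]])
x_opt = np.linalg.solve(B, -b)
print("stationary point (coded units):", x_opt)
```

ANN-GA replaces the fixed quadratic with a trained network and searches it with a genetic algorithm, which is why it can capture the non-linear dependence the record says the quadratic cannot.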
An overall 3.7-fold increase in productivity was obtained when compared to a batch process. To our knowledge, this is the first study to optimize and compare statistical and artificial-intelligence techniques for a continuous bead milling process. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on the bead milling parameters. Quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence, whereas an ANN, being a layered summation function, is capable of representing it; in this case, enzyme recovery as a function of the bead milling parameters. Since a GA can optimize even discontinuous functions, the present study is an example of using machine learning (ANN) in combination with evolutionary optimization (GA) to represent otherwise undefined biological functions, which is the case for common industrial processes involving biological moieties. 15. Aspects Regarding the Medical Data Processing - The Statistical Study of Malocclusions Directory of Open Access Journals (Sweden) Georgeta ZEGAN 2012-11-01 Full Text Available An important aspect in the analysis of medical data is their statistical processing, which gives useful directions in finding the diagnosis and the most adequate treatment, especially when the amount of statistical data is very large and extended in time. We give an example in this paper by presenting a statistical evaluation of a group of patients who attended the Dental Ambulatory for Children (Iaşi) for orthodontic consulting over a period of 10 years (2000-2010). The study was performed on 375 patients (157 boys and 218 girls) aged 4-24 years with malocclusions. The diagnosis was established by clinical and paraclinical exams (cast and radiological measurements). Both removable and fixed appliances were used in conducting the treatment.
The patients in need of orthodontic treatment presented Class I malocclusion in 63.2% of cases, Class II in 28.3%, and Class III in 5.8%. The proportion of group and isolated malocclusions varied on the basis of the clinical manifestations associated with the malocclusion class. Based on age, the patients received prophylactic treatment (3%), interceptive treatment (5%), or curative treatment (92%). The Pearson correlations computed on the sample of patients proved the existence of a directly proportional connection between the therapeutic results, the diagnosis of the malocclusions, and the treatment chosen to be carried out. The results regarding the prevalence of malocclusion types are comparable with those from the literature. The correlations that were carried out were based on medical reasoning. All these results are useful to depict the general characteristics of the Dental Ambulatory’s potential patients and, as a consequence, to establish more easily the most appropriate treatment. 16. Controle estatístico do processo em pintura industrial = Statistical process control in industrial painting Directory of Open Access Journals (Sweden) Valentina de Lourdes Milani de Paula Soares 2006-07-01 statistical control. After statistical control of the process was achieved, control limits were established that will make it possible to monitor those processes from now on, as well as to calculate their capability indices. For the process first found to be out of control, it will be necessary to continue the study of the causes of variability. 17. The physics benchmark processes for the detector performance studies used in CLIC CDR Volume 3 CERN Document Server Allanach, B.J.; Desch, K.; Ellis, J.; Giudice, G.; Grefe, C.; Kraml, S.; Lastovicka, T.; Linssen, L.; Marschall, J.; Martin, S.P.; Muennich, A.; Poss, S.; Roloff, P.; Simon, F.; Strube, J.; Thomson, M.; Wells, J.D.
2012-01-01 This note describes the detector benchmark processes used in volume 3 of the CLIC conceptual design report (CDR), which explores a staged construction and operation of the CLIC accelerator. The goal of the detector benchmark studies is to assess the performance of the CLIC ILD and CLIC SiD detector concepts for different physics processes and at a few CLIC centre-of-mass energies. 18. Optimizing the prediction process: from statistical concepts to the case study of soccer. Directory of Open Access Journals (Sweden) Andreas Heuer Full Text Available We present a systematic approach for prediction purposes based on panel data, involving information about different interacting subjects and different times (here: two). The corresponding bivariate regression problem can be solved analytically for the final statistical estimation error. Furthermore, this expression is simplified for the special case that the subjects do not change their properties between the last measurement and the prediction period. This statistical framework is applied to the prediction of soccer matches, based on information from the previous and the present season. It is determined how well the outcome of soccer matches can be predicted theoretically. This optimum limit is compared with the actual quality of the prediction, taking the German premier league as an example. As a key step in the actual prediction process, one has to identify appropriate observables which reflect the strength of the individual teams as closely as possible. A criterion to distinguish different observables is presented. Surprisingly, chances for goals turn out to be much better suited than the goals themselves to characterize the strength of a team. Routes towards further improvement of the prediction are indicated. Finally, two specific applications are discussed. 19. Statistical media and process optimization for biotransformation of rice bran to vanillin using Pediococcus acidilactici.
Science.gov (United States) Kaur, Baljinder; Chakraborty, Debkumar 2013-11-01 A strain of P. acidilactici capable of producing vanillin from rice bran was isolated from a milk product. Response surface methodology was employed for statistical media and process optimization for the production of biovanillin. Statistical medium optimization was done in two steps involving a Plackett-Burman design and a central composite design. The RSM-optimized vanillin production medium consisted of 15% (w/v) rice bran, 0.5% (w/v) peptone, 0.1% (w/v) ammonium nitrate, 0.005% (w/v) ferulic acid, 0.005% (w/v) magnesium sulphate, and 0.1% (v/v) Tween-80, at pH 5.6 and a temperature of 37 °C under shaking at 180 rpm. 1.269 g/L vanillin was obtained within 24 h of incubation in the optimized culture medium. This is the first report of such a high vanillin yield obtained during biotransformation of ferulic acid to vanillin using a Pediococcal isolate. 20. Statistical optimization of process parameters on biohydrogen production from glucose by Clostridium sp. Fanp2. Science.gov (United States) Pan, C M; Fan, Y T; Xing, Y; Hou, H W; Zhang, M L 2008-05-01 Statistically based experimental designs were applied to optimizing process parameters for hydrogen production from glucose by Clostridium sp. Fanp2, which was isolated from the effluent sludge of an anaerobic hydrogen-producing bioreactor. The important factors influencing hydrogen production, identified by the initial Plackett-Burman screening method, were glucose, phosphate buffer and vitamin solution. The path of steepest ascent was undertaken to approach the optimal region of the three significant factors. A Box-Behnken design and response surface analysis were adopted to further investigate the mutual interactions between the variables and identify the optimal values that bring maximum hydrogen production.
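The Plackett-Burman screening step used in both records above estimates main effects from a small two-level orthogonal design; a sketch using a Sylvester-built Hadamard design (eight runs, up to seven factors) with invented effects, which shares the Plackett-Burman orthogonality property without reproducing either study's factors:

```python
import numpy as np

# Sylvester construction of an 8x8 Hadamard matrix; dropping the all-ones
# column leaves a two-level orthogonal screening design for up to 7 factors,
# the same idea as the Plackett-Burman designs used in the studies.
H2 = np.array([[1, 1], [1, -1]])
H8 = np.kron(H2, np.kron(H2, H2))
design = H8[:, 1:]                      # 8 runs x 7 factor columns

# Hypothetical responses with main effects only on factors 0 and 2.
effects_true = np.array([3.0, 0, 1.5, 0, 0, 0, 0])
y = 10 + design @ (effects_true / 2)    # effect = change from level -1 to +1

# Estimated effect of factor j: mean(y at +1) minus mean(y at -1).
est = np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                for j in range(7)])
print(np.round(est, 2))                 # large values flag the active factors
```

Because the columns are orthogonal and balanced, each main effect is estimated independently of the others, which is exactly why so few runs suffice for screening before a central composite or Box-Behnken follow-up.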
Experimental results showed that glucose, vitamin solution and phosphate buffer concentration each had an individually significant influence on the specific hydrogen production potential (Ps). The interactions between glucose and vitamin solution, and between glucose and phosphate buffer, were also significant. The optimal conditions for the maximal Ps were: glucose 23.75 g/l, phosphate buffer 0.159 M and vitamin solution 13.3 ml/l. Using this statistical optimization method, hydrogen production from glucose was increased from 2248.5 to 4165.9 ml H2/l. 1. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research Institute of Scientific and Technical Information of China (English) Qi-Yi Tang; Chuan-Xi Zhang 2013-01-01 A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. 2. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research. Science.gov (United States) Tang, Qi-Yi; Zhang, Chuan-Xi 2013-04-01 A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software.
This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. 3. Multivariate process modeling of high-volume manufacturing of consumer electronics Science.gov (United States) Asp, Stefan; Wide, Peter 1998-12-01 As production volumes continue to increase and the global market for consumer electronics becomes fiercer, a reliable and essentially fault-free production process is becoming a necessity for survival. The manufacturing processes of today are highly complex, and the increasing amount of process data produced is making it hard to extract the useful information from a huge data set. We have used multivariate and nonlinear process modeling to examine the surface mount production process in high-volume manufacturing of mobile telephones and have built an artificial neural network model of the process. As input parameters to the model we used process data logged by automatic test equipment, and the result variables come from an automatic inspection system placed after the board manufacturing process. Using multivariate process modeling has enabled us to identify parameters that contribute heavily to the quality of the product and that can further be used to optimize the manufacturing process with respect to production faults. 4. Bootstrap-based confidence estimation in PCA and multivariate statistical process control DEFF Research Database (Denmark) Babamoradi, Hamid Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, even when the theories are available for a specific indicator/parameter, they are based...... on assumptions that do not always hold in practice. The aim of this thesis was to illustrate the concept of bootstrap-based confidence estimation in PCA and MSPC.
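The bootstrap idea in the thesis record above (resample the data, rebuild the monitoring statistic, and read confidence bounds off the bootstrap distribution) can be sketched for a PCA Q-statistic control limit; the data, component count, and quantiles below are arbitrary choices, not the thesis's case study:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical in-control data with correlated variables (not the thesis's data).
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))

def q_limit(X, quantile=0.95, n_components=2):
    # Q (squared prediction error) control limit from a PCA model of X.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                      # loadings
    resid = Xc - Xc @ P @ P.T
    q = (resid ** 2).sum(axis=1)
    return np.quantile(q, quantile)

# Nonparametric bootstrap: resample observations with replacement, recompute
# the limit each time, and take percentiles as its confidence bounds.
boots = [q_limit(X[rng.integers(0, len(X), len(X))]) for _ in range(500)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"Q-limit 95% bootstrap CI: ({lo:.2f}, {hi:.2f})")
```

The same resampling loop works for any monitoring indicator, which is the point of the thesis: no distributional theory for the statistic is required.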
It particularly shows how to build bootstrap-based confidence limits in these areas to be used as an alternative to the traditional/asymptotic limits....... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. The bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus... 5. A commercial microbial enhanced oil recovery process: statistical evaluation of a multi-project database Energy Technology Data Exchange (ETDEWEB) Portwood, J.T. 1995-12-31 This paper discusses a database of information collected and organized during the past eight years from 2,000 producing oil wells in the United States, all of which have been treated with special application techniques developed to improve the effectiveness of MEOR technology. The database, believed to be the first of its kind, has been generated for the purpose of statistically evaluating the effectiveness and economics of the MEOR process in a wide variety of oil reservoir environments, and is a tool that can be used to improve the predictability of treatment response. The information in the database has also been evaluated to determine which, if any, reservoir characteristics are dominant factors in determining the applicability of MEOR. 6. Improving Statistical Process Monitoring of Quality Characteristic with Polynomial Profile in Phase II Directory of Open Access Journals (Sweden) Amirhossein Amiri 2012-03-01 Full Text Available Profile monitoring is a new research subject in statistical process control which has recently been considered by many researchers. A profile describes the relationship between a response variable and one or more independent variables. This relationship is often modeled using regression, which can be simple linear, multiple linear, polynomial or sometimes nonlinear.
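The profile-monitoring record above charts a fitted relationship rather than a single quality characteristic; one common implementation fits orthogonal polynomial coefficients to each observed profile and charts those. A sketch with hypothetical quadratic profiles, using a Legendre basis to stand in for the paper's orthogonal polynomials:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 20)                       # fixed measurement points

def profile_coeffs(y, deg=2):
    # Fit each observed profile with orthogonal (Legendre) polynomials;
    # the coefficients summarize the profile for control charting.
    return np.polynomial.legendre.legfit(x, y, deg)

# Hypothetical in-control profiles: quadratic shape plus noise.
profiles = [1 + 0.5 * x + 0.3 * (3 * x ** 2 - 1) / 2 + rng.normal(0, 0.05, x.size)
            for _ in range(50)]
C = np.array([profile_coeffs(y) for y in profiles])  # 50 profiles x 3 coefficients

# Shewhart-style limits per coefficient (one chart per coefficient).
centers, sds = C.mean(axis=0), C.std(axis=0, ddof=1)
limits = np.stack([centers - 3 * sds, centers + 3 * sds])
print(np.round(centers, 2))
```

Orthogonality of the basis keeps the coefficient estimates uncorrelated, which is why the paper can reduce the chart count; the two-chart scheme it proposes combines these coefficients further rather than charting each separately.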
To monitor polynomial profiles, some methods have been developed, the best of which is the orthogonal polynomial approach. One of the disadvantages of this method is the large number of control charts that are used simultaneously. In this paper, a new approach is proposed based on the orthogonal polynomial approach, in which only two control charts are used to monitor a kth-order polynomial profile. Simulation findings and average run length (ARL) curve analysis imply better performance of the proposed approach compared to the existing approach. The proposed approach is also much easier to use in practice. 7. Statistical study of influence of the IMF cone angle on foreshock processes Science.gov (United States) Urbar, Jaroslav; Nemecek, Zdenek; Safrankova, Jana; Prech, Lubomir 2016-07-01 The parameters of the solar wind plasma are modified upstream of the Earth's bow shock, in the ion foreshock region, which is typically observed at the quasi-parallel bow shock. The associated ULF waves are created by the interaction of the solar wind plasma with ions reflected at the bow shock, where they generate fast magnetosonic waves with an in-phase relationship between the ion flux and magnetic field fluctuations. Using multipoint observations from the THEMIS spacecraft located in the vicinity of the bow shock or in the foreshock, we present statistical maps of the modification of solar wind parameters due to foreshock processes (solar wind heating and deceleration, enhancements of electric and magnetic field fluctuation levels, etc.). In the paper, special attention is devoted to intervals of radial interplanetary magnetic field that create a foreshock upstream of the whole dayside bow shock. 8. Statistical modelling of space-time processes with application to wind power DEFF Research Database (Denmark) Lenzi, Amanda
This thesis aims at contributing to the wind power literature by building and evaluating new statistical techniques for producing forecasts at multiple locations and lead times using spatio-temporal information. By exploring the features of a rich portfolio of wind farms in western Denmark, we investigate...... correlation is captured by a latent Gaussian field. We explore how such models can be handled with stochastic partial differential approximations of Matérn Gaussian fields together with integrated nested Laplace approximations. We show that complex hierarchical spatial models are well suited for wind power...... forecasts. The second approach has a common intercept for all farms and a spatio-temporal model that varies in time with first order autoregressive dynamics and has spatially correlated innovations given by a zero mean Gaussian process. The third model, which also has a common intercept as well... 9. Self-Organized Criticality in Astrophysics The Statistics of Nonlinear Processes in the Universe CERN Document Server Aschwanden, Markus 2011-01-01 The concept of ‘self-organized criticality’ (SOC) has been applied to a variety of problems, ranging from population growth and traffic jams to earthquakes, landslides and forest fires. The technique is now being applied to a wide range of phenomena in astrophysics, such as planetary magnetospheres, solar flares, cataclysmic variable stars, accretion disks, black holes and gamma-ray bursts, and also to phenomena in galactic physics and cosmology. Self-organized Criticality in Astrophysics introduces the concept of SOC and shows that, due to its universality and ubiquity, it is a law of nature. The theoretical framework and specific physical models are described, together with a range of applications in various aspects of astrophyics. 
The mathematical techniques, including the statistics of random processes, time series analysis, and time scale and waiting time distributions, are presented, and the results are applied to specific observations of astrophysical phenomena. 10. Freely Evolving Process and Statistics in the Two-Dimensional Granular Turbulence Science.gov (United States) Isobe, Masaharu 2002-08-01 We studied the macroscopic statistical properties of a freely evolving quasi-inelastic hard disk (granular) system by systematically performing large-scale (more than a million particles) event-driven molecular dynamics simulations, and found behavior remarkably analogous to the enstrophy cascade process in decaying two-dimensional fluid turbulence. There are four typical stages in the freely evolving inelastic hard disk system: homogeneous, shearing (vortex), clustering and final. In the shearing stage, self-organized macroscopic coherent vortices become dominant and the enstrophy decays with power-law behavior. In the clustering stage, the energy spectra are close to the expectation of Kraichnan-Batchelor theory and the squared two-particle separation strictly obeys Richardson's law. These results indicate that the cooperative behavior of the quasi-inelastic hard disk system falls in the same universality class as macroscopic Navier-Stokes fluid turbulence in the study of dissipative structures. 11. The asymmetric simple exclusion process: an integrable model for non-equilibrium statistical mechanics Science.gov (United States) Golinelli, Olivier; Mallick, Kirone 2006-10-01 The asymmetric simple exclusion process (ASEP) plays the role of a paradigm in non-equilibrium statistical mechanics. We review exact results for the ASEP obtained by the Bethe ansatz and put emphasis on the algebraic properties of this model. The Bethe equations for the eigenvalues of the Markov matrix of the ASEP are derived from the algebraic Bethe ansatz.
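Alongside the exact Bethe-ansatz results reviewed above, the ASEP is also easy to study numerically. A minimal Monte Carlo sketch of the totally asymmetric version on a ring, whose stationary current per attempted move approaches rho*(1 - rho) at particle density rho (an illustration, not part of the review):

```python
import random

random.seed(0)
L, N, steps = 100, 50, 200_000        # ring size, particles (density 0.5), attempts
occ = [1] * N + [0] * (L - N)
random.shuffle(occ)

hops = 0
for _ in range(steps):
    i = random.randrange(L)            # pick a random site
    j = (i + 1) % L
    if occ[i] == 1 and occ[j] == 0:    # hard-core exclusion: hop right only into a hole
        occ[i], occ[j] = 0, 1
        hops += 1

current = hops / steps                 # average hops per attempted move
print(f"measured current per attempt: {current:.3f}; "
      f"mean-field rho*(1-rho) = {0.5 * 0.5:.3f}")
```

On a ring the stationary measure is uniform over configurations with N particles, so the measured current should sit near N(L-N)/(L(L-1)), close to the mean-field value.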
Using these equations we explain how to calculate the spectral gap of the model and how global spectral properties such as the existence of multiplets can be predicted. An extension of the Bethe ansatz leads to an analytic expression for the large deviation function of the current in the ASEP that satisfies the Gallavotti-Cohen relation. Finally, we describe some variants of the ASEP that are also solvable by the Bethe ansatz. 12. The asymmetric simple exclusion process: an integrable model for non-equilibrium statistical mechanics Energy Technology Data Exchange (ETDEWEB) Golinelli, Olivier [Service de Physique Theorique, CEA Saclay, 91191 Gif-sur-Yvette Cedex (France); Mallick, Kirone [Service de Physique Theorique, CEA Saclay, 91191 Gif-sur-Yvette Cedex (France) 2006-10-13 The asymmetric simple exclusion process (ASEP) plays the role of a paradigm in non-equilibrium statistical mechanics. We review exact results for the ASEP obtained by the Bethe ansatz and put emphasis on the algebraic properties of this model. The Bethe equations for the eigenvalues of the Markov matrix of the ASEP are derived from the algebraic Bethe ansatz. Using these equations we explain how to calculate the spectral gap of the model and how global spectral properties such as the existence of multiplets can be predicted. An extension of the Bethe ansatz leads to an analytic expression for the large deviation function of the current in the ASEP that satisfies the Gallavotti-Cohen relation. Finally, we describe some variants of the ASEP that are also solvable by the Bethe ansatz. 13. Statistical optimization of microencapsulation process for coating of magnesium particles with Viton polymer Energy Technology Data Exchange (ETDEWEB) Pourmortazavi, Seied Mahdi, E-mail: pourmortazavi@yahoo.com [Faculty of Material and Manufacturing Technologies, Malek Ashtar University of Technology, P.O. 
Box 16765-3454, Tehran (Iran, Islamic Republic of); Babaee, Saeed; Ashtiani, Fatemeh Shamsi [Faculty of Chemistry & Chemical Engineering, Malek Ashtar University of Technology, Tehran (Iran, Islamic Republic of) 2015-09-15 Graphical abstract: - Highlights: • The surface of magnesium particles was modified with Viton via a solvent/non-solvent method. • FT-IR, SEM, EDX, map analysis, and TG/DSC techniques were employed to characterize the coated particles. • Coating process factors were optimized by Taguchi robust design. • The importance of the coating conditions for the resistance of coated magnesium against oxidation was studied. - Abstract: The surface of magnesium particles was modified by coating with Viton as an energetic polymer using a solvent/non-solvent technique. The Taguchi robust method was utilized as a statistical experiment design to evaluate the role of the coating process parameters. The coated magnesium particles were characterized by various techniques, i.e., Fourier transform infrared (FT-IR) spectroscopy, scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), thermogravimetry (TG), and differential scanning calorimetry (DSC). The results showed that coating magnesium powder with Viton leads to a higher resistance of the metal against oxidation in air. Meanwhile, tuning the coating process parameters (i.e., percent of Viton, flow rate of non-solvent addition, and type of solvent) influences the resistance of the metal particles against thermal oxidation. Coating yields Viton-coated magnesium particles with higher thermal stability (632 °C) in comparison with the pure magnesium powder, which commences oxidation in air at the lower temperature of 260 °C. 14.
Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach Directory of Open Access Journals (Sweden) Sutikno Sutikno 2010-08-01 Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model consisting of deterministic numerical equations that follow the rules of physics. GCMs are a main tool for predicting climate and weather, and they also serve as a primary information source for reviewing climate change effects. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM with the small scale of the study area. GCM data are spatial and temporal, so spatial correlation between grid points within a single domain is likely to occur. These multicollinearity problems create the need for pre-processing of the predictor data X. Continuum Regression (CR) and pre-processing with Principal Component Analysis (PCA) are alternatives for SD modelling. CR is a method developed by Stone and Brooks (1990). This method is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, used to overcome multicollinearity problems. Data processing for the stations in Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that the RMSEP and predictive R2 values in the 8x8 and 12x12 domains obtained with the CR method are better than those obtained with PCR and PLS. 15. Statistical learning problem of artificial neural network to control roofing process Directory of Open Access Journals (Sweden) Lapidus Azariy 2017-01-01 Full Text Available Software developed on the basis of artificial neural networks (ANN) has now been actively implemented in construction companies to support decision-making in the organization and management of construction processes. ANN learning is the main stage of its development.
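The pre-processing step in the downscaling record above can be sketched as principal component regression: project the collinear grid predictors onto a few leading PCs, then regress the station series on the scores (CR itself interpolates between OLS, PCR and PLS and is not reproduced here). All data below are synthetic stand-ins for GCM grid output:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical GCM-like predictors: 200 times x 64 grid cells, strongly
# spatially correlated (hence multicollinear), driving a station series y.
base = rng.normal(size=(200, 4))
X = base @ rng.normal(size=(4, 64)) + 0.1 * rng.normal(size=(200, 64))
y = base[:, 0] - 0.5 * base[:, 1] + 0.1 * rng.normal(size=200)

# Principal component regression: project X onto leading PCs, then OLS on the scores.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 4                                   # number of retained components (a choice)
scores = Xc @ Vt[:k].T
D = np.column_stack([np.ones(200), scores])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
y_hat = D @ beta
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"PCR with {k} components: R^2 = {r2:.3f}")
```

Because the retained scores are orthogonal, the regression is numerically stable even though the raw grid columns are nearly collinear, which is the problem the record's pre-processing addresses.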
A key question for supervised learning is how many training examples are needed to approximate the true relationship between network inputs and outputs with the desired accuracy. The design of the ANN architecture is also related to a learning problem known as the "curse of dimensionality". This problem is important for the study of construction process management because of the difficulty of obtaining training data from construction sites. In previous studies the authors designed a 4-layer feedforward ANN with a 12-5-4-1 unit structure to approximate the estimation and prediction of the roofing process. This paper presents the statistical learning side of the created ANN with a simple error-minimization algorithm. The sample size for efficient training and the confidence interval of the network outputs are defined. In conclusion, the authors predict successful ANN learning in a large construction business company within a short space of time. 16. Statistical physics CERN Document Server Sadovskii, Michael V 2012-01-01 This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems. 17. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis Science.gov (United States) Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A. 
2008-01-01 Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids, resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite, BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species, enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved. 18. Approximation of epidemic models by diffusion processes and their statistical inference. 
Science.gov (United States) Guy, Romain; Larédo, Catherine; Vergu, Elisabeta 2015-02-01 Multidimensional continuous-time Markov jump processes [Formula: see text] on [Formula: see text] form a usual set-up for modeling [Formula: see text]-like epidemics. However, when facing incomplete epidemic data, inference based on [Formula: see text] is not easily achieved. Here, we start building a new framework for the estimation of key parameters of epidemic models based on statistics of diffusion processes approximating [Formula: see text]. First, previous results on the approximation of density-dependent [Formula: see text]-like models by diffusion processes with small diffusion coefficient [Formula: see text], where [Formula: see text] is the population size, are generalized to non-autonomous systems. Second, our previous inference results on discretely observed diffusion processes with small diffusion coefficient are extended to time-dependent diffusions. Consistent and asymptotically Gaussian estimates are obtained for a fixed number [Formula: see text] of observations, which corresponds to the epidemic context, and for [Formula: see text]. A correction term, which yields better estimates non-asymptotically, is also included. Finally, the performance and robustness of our estimators with respect to various parameters such as [Formula: see text] (the basic reproduction number), [Formula: see text], [Formula: see text] are investigated on simulations. Two models, [Formula: see text] and [Formula: see text], corresponding to single and recurrent outbreaks, respectively, are used to simulate data. The findings indicate that our estimators have good asymptotic properties and behave noticeably well for realistic numbers of observations and population sizes. This study lays the foundations of a generic inference method currently under extension to incompletely observed epidemic data. 
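A diffusion approximation of the kind described in record 18 can be illustrated for an SIR-type model simulated by the Euler-Maruyama scheme; the parameter values and the independent-noise-per-event simplification below are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def sir_diffusion(beta, gamma, N, s0, i0, T, dt, rng):
    """Euler-Maruyama simulation of the diffusion approximation of a
    stochastic SIR model: the proportions (s, i) follow the ODE drift
    plus Gaussian noise of order 1/sqrt(N), N being the population size."""
    n_steps = int(T / dt)
    s, i = s0, i0
    path = [(s, i)]
    for _ in range(n_steps):
        inf = beta * s * i          # infection event rate
        rec = gamma * i             # recovery event rate
        # one Brownian increment per event type (a common construction)
        dB1, dB2 = rng.normal(scale=np.sqrt(dt), size=2)
        s += -inf * dt - np.sqrt(inf / N) * dB1
        i += (inf - rec) * dt + np.sqrt(inf / N) * dB1 - np.sqrt(rec / N) * dB2
        s, i = max(s, 0.0), max(i, 0.0)   # keep proportions non-negative
        path.append((s, i))
    return np.array(path)

rng = np.random.default_rng(1)
# basic reproduction number beta/gamma = 3, population 10,000
path = sir_diffusion(beta=1.5, gamma=0.5, N=10_000, s0=0.99, i0=0.01,
                     T=30, dt=0.01, rng=rng)
```

As N grows the noise terms vanish and the path converges to the deterministic SIR trajectory, which is what makes the small-diffusion asymptotics in the record workable.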
Indeed, contrary to the majority of current inference techniques for partially observed processes, which necessitate computer-intensive simulations, our method being mostly an 19. Wind gust estimation by combining numerical weather prediction model and statistical post-processing Science.gov (United States) Patlakas, Platon; Drakaki, Eleni; Galanis, George; Spyrou, Christos; Kallos, George 2017-04-01 The continuous rise of off-shore and near-shore activities, as well as the development of structures such as wind farms and various offshore platforms, requires the employment of state-of-the-art risk assessment techniques. Such analysis is used to set the safety standards and can be characterized as a climatologically oriented approach. Nevertheless, reliable operational support is also needed in order to minimize cost drawbacks and human danger during the construction and operation stages, as well as during maintenance activities. One of the most important parameters for this kind of analysis is the wind speed intensity and variability. A critical measure associated with this variability is the presence and magnitude of wind gusts, as estimated at the reference level of 10 m. The latter can be attributed to different processes ranging from boundary-layer turbulence and convective activity to mountain waves and wake phenomena. The purpose of this work is the development of a wind gust forecasting methodology combining a Numerical Weather Prediction model and a dynamical statistical tool based on Kalman filtering. To this end, the Wind Gust Estimate parameterization method was implemented to function within the framework of the atmospheric model SKIRON/Dust. The new modeling tool combines the atmospheric model with a statistical local adaptation methodology based on Kalman filters. This has been tested over the offshore west coastline of the United States. 
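The Kalman-filter-based local adaptation described in record 19 can be illustrated with a scalar filter that tracks the systematic forecast bias; the noise variances and synthetic data below are assumptions for illustration:

```python
import numpy as np

def kalman_bias_filter(forecasts, observations, q=0.01, r=1.0):
    """Sequentially estimate the systematic forecast bias with a scalar
    Kalman filter (state = bias, modeled as a random walk with variance q;
    observation = obs - forecast, with noise variance r) and return the
    locally adapted forecasts."""
    bias, p = 0.0, 1.0                    # state estimate and its variance
    adapted = []
    for f, o in zip(forecasts, observations):
        adapted.append(f + bias)          # correct with the current estimate
        p += q                            # predict step: random-walk state
        k = p / (p + r)                   # Kalman gain
        bias += k * ((o - f) - bias)      # update with the newest innovation
        p *= (1.0 - k)
    return np.array(adapted)

# synthetic example: model output with a constant +2 bias plus noise
rng = np.random.default_rng(2)
truth = 10 + np.sin(np.linspace(0, 8, 200))
raw = truth + 2.0 + rng.normal(scale=0.3, size=200)
adj = kalman_bias_filter(raw, truth)
```

After a spin-up period the bias estimate settles near -2 and the adapted series tracks the verifying values much more closely than the raw output, mirroring the "local adjustment post-process" evaluated against the buoy data.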
The main purpose is to provide a useful tool for wind analysis and prediction and for applications related to offshore wind energy (power prediction, operation and maintenance). The results have been evaluated using observational data from NOAA's buoy network. The predicted output shows good behavior that is further improved by the local adjustment post-processing. 20. Machining Error Control by Integrating Multivariate Statistical Process Control and Stream of Variations Methodology Institute of Scientific and Technical Information of China (English) WANG Pei; ZHANG Dinghua; LI Shan; CHEN Bing 2012-01-01 For aircraft manufacturing industries, the analysis and prediction of part machining error during the machining process are very important for controlling and improving part machining quality. In order to effectively control machining error, a method integrating multivariate statistical process control (MSPC) and stream of variations (SoV) is proposed. Firstly, machining error is modeled by multi-operation approaches for the part machining process. SoV is adopted to establish the mathematical model of the relationship between the error of upstream operations and the error of downstream operations. Here error sources include not only the influence of upstream operations but also many other error sources. Standard and predictive SoV models are built, depending on whether the operation has been completed or not, to satisfy different requirements during the part machining process. Secondly, the method of one-step-ahead forecast error (OSFE) is used to eliminate autocorrelation in the sample data from the SoV model, and the T2 control chart in MSPC is built to realize machining error detection according to the data characteristics of the above error model, which can judge whether the operation is out of control or not. If it is, then feedback is sent to the operations. The error model is modified by adjusting the operation that is out of control, and it is then continually 
used to monitor operations. Finally, a machining instance containing two operations demonstrates the effectiveness of the machining error control method presented in this paper. 1. Statistics of MLT wind field values derived from 11 years of common volume specular meteor observations in northern Norway Science.gov (United States) Chau, Jorge Luis; Stober, Gunter; Laskar, Fazlul; Hall, Chris M.; Tsutsumi, Masaki 2016-04-01 Traditionally, mean values of the mesosphere and lower thermosphere winds over the radar volume are obtained using monostatic specular meteor radars. Such an observing volume consists of a few hundred kilometers in radius. Moreover, the differences between measured radial velocities and the radial velocities expected from the measured mean winds are used to derive properties of gravity wave momentum fluxes. Recently, Stober and Chau [2015] proposed using a multi-static approach to retrieve horizontally resolved wind fields, where most of the radar volume is observed from different viewing angles. Similar results can be obtained if measurements from nearby monostatic systems are combined. In this work we present the results of wind fields derived by combining specular meteor radar data between 2004 and 2015 from the Tromsø (19.22°E, 69.58°N) and Andenes (16.04°E, 69.27°N) radar systems. Among the directly estimated values are the mean winds and the horizontal and vertical gradients of the zonal and meridional winds. Combining the horizontal gradients, the horizontal divergence, relative vorticity, shear and deformation are derived. The seasonal and annual variability of these parameters is presented and discussed, as well as the planetary wave, tidal, and gravity wave information embedded in these new parameters. 2. Iontophoretic delivery of lisinopril: Optimization of process variables by Box-Behnken statistical design. 
Science.gov (United States) Gannu, Ramesh; Yamsani, Vamshi Vishnu; Palem, Chinna Reddy; Yamsani, Shravan Kumar; Yamsani, Madhusudan Rao 2010-01-01 The objective of the investigation was to optimize the iontophoresis process parameters of lisinopril (LSP) by a 3 x 3 factorial Box-Behnken statistical design. LSP is an ideal candidate for iontophoretic delivery to avoid the incomplete absorption problem associated with its oral administration. The independent variables selected were current (X(1)), salt (sodium chloride) concentration (X(2)) and medium/pH (X(3)). The dependent variables studied were the amount of LSP permeated in 4 h (Y(1): Q(4)) and 24 h (Y(2): Q(24)), and the lag time (Y(3)). Mathematical equations and response surface plots were used to relate the dependent and independent variables. The regression equations generated for the iontophoretic permeation were Y(1) = 1.98 + 1.23X(1) - 0.49X(2) + 0.025X(3) - 0.49X(1)X(2) + 0.040X(1)X(3) - 0.010X(2)X(3) + 0.58X(1)(2) - 0.17X(2)(2) - 0.18X(3)(2); Y(2) = 7.28 + 3.32X(1) - 1.52X(2) + 0.22X(3) - 1.30X(1)X(2) + 0.49X(1)X(3) - 0.090X(2)X(3) + 0.79X(1)(2) - 0.62X(2)(2) - 0.33X(3)(2) and Y(3) = 0.60 + 0.0038X(1) + 0.12X(2) - 0.011X(3) + 0.005X(1)X(2) - 0.018X(1)X(3) - 0.015X(2)X(3) - 0.00075X(1)(2) + 0.017X(2)(2) - 0.11X(3)(2). The statistical validity of the polynomials was established, and optimized process parameters were selected by feasibility and grid search. Validation of the optimization study with 8 confirmatory runs indicated a high degree of prognostic ability of the response surface methodology. The Box-Behnken design approach helped in identifying the critical process parameters in the iontophoretic delivery of lisinopril. 3. 
Signal processing and statistical descriptive reanalysis of steady state chute-flow experiments Science.gov (United States) truong, hoan; eckert, nicolas; keylock, chris; naaim, mohamed; bellot, hervé 2014-05-01 Accurate knowledge of snow rheology is needed for mitigation against avalanche hazard. Indeed, snow avalanches have a significant impact on the livelihoods and economies of alpine communities. To this end, 60 small-scale in-situ flow experiments were performed with various slopes, temperatures and flow depths. Previous investigation of these data seemed to show that the dense flow of dry snow may be composed of two layers: a sheared basal layer made of single snow grains and a less-sheared upper layer made of large aggregates. These outcomes were mainly based on the mean velocity profile of the flow and on its interpretation in terms of the rheological behavior of granular materials and snow microstructure [Pierre G. Rognon et al., 2007]. Here, the main objective remains the same, but the rheological and physical viewpoints are put aside in order to extract as much information as possible from the data using various signal processing and descriptive statistics methods, such as the maximum overlap discrete wavelet transform (MODWT), transfer entropy (TE) and maximum cross-correlation (MCC). Specifically, we aim at improving the velocity estimates as a function of depth, particularly the velocity fluctuations around the mean profile, to better document the behavior of dense dry snow flows during a steady and uniform chute regime. The data are composed of pairs of voltage signals (right and left), so the velocity is known only indirectly. The MCC method is classically used to determine the time lag between the two signals. Previously, the MCC method showed that the mean velocity profile may be fitted by a simple bilinear function [Pierre G. Rognon et al., 2007], but no interesting temporal dynamics could be highlighted. 
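The maximum cross-correlation (MCC) time-lag estimation described above can be sketched as follows; the synthetic signals are illustrative stand-ins for the paired voltage traces:

```python
import numpy as np

def lag_by_mcc(left, right, dt):
    """Estimate the time lag between two sensor signals as the shift that
    maximizes their cross-correlation (the classical MCC approach)."""
    left = left - left.mean()
    right = right - right.mean()
    corr = np.correlate(right, left, mode="full")
    # index of the correlation peak, converted to a signed sample shift
    shift = np.argmax(corr) - (len(left) - 1)
    return shift * dt

# synthetic check: the right signal trails the left one by 15 samples
rng = np.random.default_rng(3)
sig = rng.normal(size=500)
left = sig
right = np.roll(sig, 15)
lag = lag_by_mcc(left, right, dt=1.0)   # recovers a lag of 15 samples
```

Dividing the traces into short windows and repeating this estimate per window is what yields a velocity series with temporal resolution, which is the improvement the new processing method targets.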
Hence, a new processing method was developed to provide velocity series with much better temporal resolution. The process mainly consists of a MODWT-based denoising method and the choice of window size for correlation. The results prove to be 4. Use of graphical statistical process control tools to monitor and improve outcomes in cardiac surgery. Science.gov (United States) Smith, Ian R; Garlick, Bruce; Gardner, Michael A; Brighouse, Russell D; Foster, Kelley A; Rivers, John T 2013-02-01 Graphical Statistical Process Control (SPC) tools have been shown to promptly identify significant variations in clinical outcomes in a range of health care settings. We explored the application of these techniques to qualitatively inform the routine cardiac surgical morbidity and mortality (M&M) review process at a single site. Baseline clinical and procedural data relating to 4774 consecutive cardiac surgical procedures, performed between 1 January 2003 and 30 April 2011, were retrospectively evaluated. A range of appropriate performance measures and benchmarks were developed and evaluated using a combination of CUmulative SUM (CUSUM) charts, Exponentially Weighted Moving Average (EWMA) charts and funnel plots. Charts have been discussed at the unit's routine M&M meetings. Risk adjustment (RA) based on EuroSCORE has been incorporated into the charts to improve performance. Discrete and aggregated measures, including Blood Product/Reoperation, major acute post-procedural complications and Length of Stay/Readmission, were evaluated. These tools facilitate near "real-time" performance monitoring, allowing early detection of and intervention in altered performance. Careful interpretation of charts for group and individual operators has proven helpful in detecting and differentiating systemic vs. individual variation. Copyright © 2012 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. 
All rights reserved. 5. Statistical evaluation and optimization of zinc electrolyte hot purification process by Taguchi method Institute of Scientific and Technical Information of China (English) Bahram Behnajady; Javad Moghaddam 2015-01-01 The neutral zinc sulfate solution obtained from the hydrometallurgical processing of Angouran zinc concentrate contains cadmium, nickel and cobalt impurities that must be removed before electrowinning. Cadmium and nickel are usually cemented out by the addition of zinc dust, and the remaining nickel and cobalt are cemented out at a second stage with zinc powder and arsenic trioxide. In this research, a new approach is described for determining the effective parameters and optimizing the zinc electrolyte hot purification process using statistical design of experiments. The Taguchi method based on an orthogonal array design (OAD) was used to arrange the experimental runs. The experimental conditions are as follows: 70−90 °C for reaction temperature (T), 30−90 min for reaction time (t), 2−4 g/L for zinc powder mass concentration (M), one of five series for zinc dust particle size distribution (S1−S5), and 0.1−0.5 g/L for arsenic trioxide mass concentration (C). The optimum conditions for hot purification obtained in this work are T4 (85 °C), t4 = 75 min, M4 = 3.5 g/L, S4 (series 4), and C2 = 0.2 g/L. 6. Improving medium-range ensemble streamflow forecasts through statistical post-processing Science.gov (United States) Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey 2017-04-01 Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. 
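A Taguchi analysis like the one in the hot purification study (record 5 above) typically ranks factor levels by their mean signal-to-noise ratio; the following sketch uses a hypothetical single factor at two levels and made-up removal efficiencies, purely for illustration:

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio for a 'larger is better' response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def main_effects(levels, sn):
    """Average S/N per factor level; the best level maximizes the mean S/N."""
    return {lv: float(np.mean([s for l, s in zip(levels, sn) if l == lv]))
            for lv in sorted(set(levels))}

# hypothetical orthogonal-array fragment: one factor at two levels,
# response = impurity removal efficiency (%) over replicate runs
runs = [(1, [90.0, 91.0]), (1, [89.0, 92.0]),
        (2, [97.0, 96.0]), (2, [98.0, 95.0])]
levels = [lv for lv, _ in runs]
sn = [sn_larger_is_better(y) for _, y in runs]
effects = main_effects(levels, sn)
best = max(effects, key=effects.get)   # the level with the higher mean S/N
```

In a full study the same per-level averaging is done for every column of the orthogonal array, and the combination of best levels (here, e.g., T4, t4, M4, S4, C2) is taken as the optimum.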
A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution. 7. Artificial intelligence versus statistical modeling and optimization of continuous bead milling process for bacterial cell lysis Directory of Open Access Journals (Sweden) Shafiul Haque 2016-11-01 Full Text Available Abstract: For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release is a major bottleneck. The recovery of a recombinant protein, cholesterol oxidase (COD), was studied in a continuous bead milling process. 
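Of the post-processing techniques listed for the streamflow record above (no. 6), empirical quantile mapping is among the simplest; a minimal sketch with synthetic training climatologies (all values illustrative, not from the study):

```python
import numpy as np

def quantile_map(forecasts, obs_climatology, fcst_climatology):
    """Empirical quantile mapping: replace each forecast by the observed
    value occupying the same quantile in the training climatology."""
    fcst_sorted = np.sort(fcst_climatology)
    obs_sorted = np.sort(obs_climatology)
    # quantile of each new forecast within the forecast climatology
    q = np.searchsorted(fcst_sorted, forecasts) / len(fcst_sorted)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_sorted, q)

# synthetic hindcast: model flows biased high by 20% relative to observations
rng = np.random.default_rng(4)
obs_train = rng.gamma(shape=2.0, scale=50.0, size=1000)   # observed flows
fcst_train = 1.2 * obs_train                              # biased hindcasts
new_fcst = np.array([60.0, 120.0, 240.0])
corrected = quantile_map(new_fcst, obs_train, fcst_train)
```

Because the training forecasts here are exactly 20% high, the mapping effectively divides new forecasts by 1.2; with real hindcasts it also corrects distribution shape, not just a multiplicative bias.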
A full factorial Response Surface Model (RSM) design was employed and compared to Artificial Neural Networks coupled with a Genetic Algorithm (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C) and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading (OD600 nm) of 74, and run time of 29.9 min, with a recovery of ~3.2 g/L. ANN coupled with GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate of 258.08 mL/h, bead loading of 80% (v/v), cell loading (OD600 nm) of 73.99, and run time of 32 min. An overall 3.7-fold increase in productivity was obtained compared to a batch process. Optimization and comparison of statistical vs. artificial intelligence techniques in a continuous bead milling process has been attempted for the very first time in our study. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on bead milling parameters. Quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence. An ANN, being a summation of functions over multiple layers, is capable of representing this complex non-linear dependence, in this case enzyme recovery as a function of bead milling parameters. Since a GA can even optimize discontinuous functions, the present study offers an example of using machine learning (ANN) in combination with evolutionary optimization (GA) to represent undefined biological functions, as is the case for common industrial processes involving biological moieties. 8. 
Performance of Statistical Control Charts with Bilateral Limits of Probability to Monitor Processes Weibull in Maintenance Directory of Open Access Journals (Sweden) Quintana Alicia Esther 2015-01-01 Full Text Available Manufacturing with optimal quality standards is underpinned by the high reliability of equipment and systems, among other essential pillars. Maintenance engineering is responsible for the planning, control and continuous improvement of critical equipment through approaches such as Six Sigma, which draws on numerous statistical tools, notably statistical process control charts. While their first applications were in production, other designs have emerged to adapt to new needs, such as monitoring equipment and systems in the manufacturing environment. The time between failures usually fits an exponential or Weibull model. The t chart and the adjusted t chart, with probabilistic control limits, are suitable alternatives for monitoring the mean time between failures. Unfortunately, it is difficult to find publications applying them to Weibull models, which are very useful in contexts such as maintenance. In addition, the literature limits the study of their performance to the analysis of the standard metric, average run length, thus giving a partial view. The aim of this paper is to explore the performance of the t chart and the adjusted t chart using three metrics, two of them unconventional. To do this, it incorporates the concept of lateral variability, in its left and right forms. Greater precision about the behavior of these charts makes it possible to understand the conditions under which each is suitable: if the main objective of monitoring is to detect deterioration, the adjusted t chart is recommended. On the other hand, when the priority is to detect improvements, the t chart without adjustment is the best choice. However, the response speed of both charts is highly variable from run to run. 9. 
Quality control of high-dose-rate brachytherapy: treatment delivery analysis using statistical process control. Science.gov (United States) Able, Charles M; Bright, Megan; Frizzell, Bart 2013-03-01 Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. SPC is a viable method for assessing the quality of HDR treatment delivery. 
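Individual and moving range process behavior charts of the kind used in this HDR study can be sketched as follows; the baseline dose readings and their spread are made up for illustration:

```python
import numpy as np

def imr_limits(x):
    """Individual (I) and moving-range (MR) chart limits from baseline data.
    Classical constants for moving ranges of size 2: d2 = 1.128, D4 = 3.267."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))               # successive moving ranges
    mr_bar = mr.mean()
    sigma = mr_bar / 1.128                # estimated short-term sigma
    i_limits = (x.mean() - 3 * sigma, x.mean() + 3 * sigma)
    mr_limits_ = (0.0, 3.267 * mr_bar)
    return i_limits, mr_limits_

def out_of_control(x, i_limits):
    lo, hi = i_limits
    return [k for k, v in enumerate(x) if v < lo or v > hi]

# baseline: 16 in-control dose readings (cGy); then one shifted measurement,
# analogous to a displaced implant changing the dose at a detector
rng = np.random.default_rng(5)
baseline = 200 + rng.normal(scale=2.0, size=16)
i_lim, mr_lim = imr_limits(baseline)
flags = out_of_control(np.append(baseline, 185.0), i_lim)
```

The individual chart catches sustained shifts such as the displaced-implant errors, while the moving range chart is sensitive to sudden changes in point-to-point variation; the study used both per dosimeter location.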
Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy. Copyright © 2013 Elsevier Inc. All rights reserved. 10. Quality Control of High-Dose-Rate Brachytherapy: Treatment Delivery Analysis Using Statistical Process Control Energy Technology Data Exchange (ETDEWEB) Able, Charles M., E-mail: cable@wfubmc.edu [Department of Radiation Oncology, Wake Forest School of Medicine, Winston-Salem, North Carolina (United States); Bright, Megan; Frizzell, Bart [Department of Radiation Oncology, Wake Forest School of Medicine, Winston-Salem, North Carolina (United States) 2013-03-01 Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. 
For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling and to ensure reproducible evaluation of treatment delivery accuracy. 11. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy]. Science.gov (United States) Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C 2010-01-01 Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, the root cause was identified and eliminated to prevent set-up errors. Set-up errors were measured for the medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions, and the upper control limits were calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using sub-groups of 3 patients, three times each shift. These values were plotted on a control chart in real time. The control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the set-up process stability; if and when stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified, and corrective action was taken before proceeding with the treatment. 
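Control limits for sub-groups of 3 set-up measurements, as in the radiotherapy record above, are classically computed from the mean sub-group range; a sketch with synthetic set-up errors (all values illustrative):

```python
import numpy as np

def xbar_limits(subgroups):
    """X-bar chart limits for sub-groups of size 3 using the classical
    A2 constant (A2 = 1.023 for n = 3): CL +/- A2 * mean sub-group range."""
    sub = np.asarray(subgroups, dtype=float)
    means = sub.mean(axis=1)
    ranges = sub.max(axis=1) - sub.min(axis=1)
    cl = means.mean()                 # center line: grand mean
    a2 = 1.023
    return cl - a2 * ranges.mean(), cl, cl + a2 * ranges.mean()

# baseline: set-up errors (mm) for 20 sub-groups of 3 patients,
# e.g. in the medial-lateral (ml) dimension
rng = np.random.default_rng(6)
baseline = rng.normal(loc=1.0, scale=0.5, size=(20, 3))
lcl, cl, ucl = xbar_limits(baseline)

# a drifted sub-group: its mean falls outside the limits, so the
# non-random pattern would trigger a search for the assignable cause
new_group_mean = np.mean([3.0, 2.8, 3.2])
stable = lcl <= new_group_mean <= ucl
```

Plotting each sub-group mean against these fixed limits in real time is what lets the process be halted on a loss of stability rather than after a per-patient audit.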
The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared to weekly sampling of set-up error in each and every patient, which may only ensure that the sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces the control costs. Copyright © 2009 SECA. Published by Elsevier España. All rights reserved. 12. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance Science.gov (United States) McCray, Wilmon Wil L., Jr. The research was prompted by the need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. 
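A process performance model of the kind implied by high-maturity CMMI practice can be sketched as a simple Monte Carlo simulation; the phases, triangular distributions, and 30-day commitment below are hypothetical, chosen only to illustrate the technique:

```python
import numpy as np

def simulate_schedule(n_trials=100_000, seed=7):
    """Monte Carlo process-performance model: total duration of three
    sequential development phases with uncertain (triangular) durations;
    returns the predicted probability of meeting a 30-day commitment."""
    rng = np.random.default_rng(seed)
    # per-phase durations in days: (minimum, most likely, maximum)
    design = rng.triangular(5, 7, 12, n_trials)
    code = rng.triangular(8, 10, 16, n_trials)
    test = rng.triangular(4, 6, 11, n_trials)
    total = design + code + test
    return float(np.mean(total <= 30.0))

p_on_time = simulate_schedule()
```

In practice the input distributions would be fitted to the organization's own historical phase data, and the simulated tail probability is what supports quantitative commitments and the prediction of future outcomes.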
The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

13. Molten salt coal gasification process development unit. Phase 1. Volume 1. PDU operations. Final report

Energy Technology Data Exchange (ETDEWEB)

Kohl, A.L.

1980-05-01

This report summarizes the results of a test program conducted on the Molten Salt Coal Gasification Process, which included the design, construction, and operation of a Process Development Unit (PDU). In this process, coal is gasified by contacting it with air in a turbulent pool of molten sodium carbonate. Sulfur and ash are retained in the melt, and a small stream is continuously removed from the gasifier for regeneration of sodium carbonate, removal of sulfur, and disposal of the ash. The process can handle a wide variety of feed materials, including highly caking coals, and produces a gas relatively free from tars and other impurities. The gasification step is carried out at approximately 1800°F. The PDU was designed to process 1 ton per hour of coal at pressures up to 20 atm. It is a completely integrated facility including systems for feeding solids to the gasifier, regenerating sodium carbonate for reuse, and removing sulfur and ash in forms suitable for disposal. Five extended test runs were made.
The observed product gas composition was quite close to that predicted on the basis of earlier small-scale tests and thermodynamic considerations. All plant systems were operated in an integrated manner during one of the runs. The principal problem encountered during the five test runs was maintaining a continuous flow of melt from the gasifier to the quench tank. Test data and discussions regarding plant equipment and process performance are presented. The program also included a commercial plant study which showed the process to be attractive for use in a combined-cycle, electric power plant. The report is presented in two volumes: Volume 1, PDU Operations, and Volume 2, Commercial Plant Study.

14. Bioenergy. Data base for the statistics of the renewable energy and emissions balance. Material volume; Bioenergie. Datengrundlagen fuer die Statistik der erneuerbaren Energien und Emissionsbilanzierung. Materialband

Energy Technology Data Exchange (ETDEWEB)

Dreher, Marion; Memmler, Michael; Rother, Stefan; Schneider, Sven [Umweltbundesamt, Dessau (Germany); Boehme, Dieter [Bundesministerium fuer Umwelt, Naturschutz und Reaktorsicherheit, Berlin (Germany)

2012-02-15

In July 2011, the Federal Environment Agency (Dessau-Rosslau, Federal Republic of Germany) and the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (Berlin, Federal Republic of Germany) held the workshop ''Bioenergy. Data base for the statistics of the renewable energy and emissions balance''. The material volume of this workshop contains plenary lectures on the state of knowledge and information needs, as well as materials for the working groups on solid biomass (working group 1), biogas (working group 2) and liquid biomass (working group 3).

15. GASP cloud- and particle-encounter statistics and their application to LFC aircraft studies. Volume 1: Analysis and conclusions

Science.gov (United States)

Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.

1984-01-01

Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical.

16. The NJOY nuclear data processing system: Volume 2, The NJOY, RECONR, BROADR, HEATR, and THERMR modules

Energy Technology Data Exchange (ETDEWEB)

MacFarlane, R.E.; Muir, D.W.; Boicourt, R.M.

1982-05-01

The NJOY nuclear data processing system is a comprehensive computer code package for producing cross sections and related nuclear parameters from ENDF/B evaluated nuclear data. This volume provides detailed descriptions of the NJOY module, which contains the executive program and utility subroutines used by the other modules, and it discusses the theory and computational methods of four of the modules used for producing pointwise cross sections: RECONR, BROADR, HEATR, and THERMR.

17.
Stochastic processes, optimization, and control theory: a volume in honor of Suresh Sethi

CERN Document Server

Yan, Houmin

2006-01-01

This edited volume contains 16 research articles. It presents recent and pressing issues in stochastic processes, control theory, differential games, optimization, and their applications in finance, manufacturing, queueing networks, and climate control. One of the salient features is that the book is highly multi-disciplinary. The book is dedicated to Professor Suresh Sethi on the occasion of his 60th birthday, in view of his distinguished career.

18. Can postoperative process of care utilization or complication rates explain the volume-cost relationship for cancer surgery?

Science.gov (United States)

Ho, Vivian; Short, Marah N; Aloia, Thomas A

2017-08-01

Past studies identify an association between provider volume and outcomes, but less is known about the volume-cost relationship for cancer surgery. We analyze the volume-cost relationship for 6 cancer operations and explore whether it is influenced by the occurrence of complications and/or utilization of processes of care. Medicare hospital and inpatient claims for the years 2005 through 2009 were analyzed for 6 cancer resections: colectomy, rectal resection, pulmonary lobectomy, pneumonectomy, esophagectomy, and pancreatic resection. Regressions were first estimated to quantify the association of provider volume with costs, excluding measures of complications and processes of care as explanatory variables. Next, these variables were added to the regressions to test whether they weakened any previously observed volume-cost relationship. Higher hospital volume is associated with lower patient costs for esophagectomy but not for the other operations. Higher surgeon volume reduces costs for most procedures, but this result weakens when processes of care are added to the regressions.
Processes of care that are frequently implemented in response to adverse events are associated with 14% to 34% higher costs. Utilization of these processes is more prevalent among low-volume than high-volume surgeons. Processes of care implemented when complications occur explain much of the surgeon volume-cost relationship. Given that surgeon volume is readily observed, better outcomes and lower costs may be achieved by referring patients to high-volume surgeons. Increasing patient access to surgeons with lower rates of complications may be the most effective strategy for avoiding costly processes of care and controlling expenditure growth. Copyright © 2017 Elsevier Inc. All rights reserved.

19. Information processing in bacteria: memory, computation, and statistical physics: a key issues review.

Science.gov (United States)

Lan, Ganhui; Tu, Yuhai

2016-05-01

preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network - the main players (nodes) and their interactions (links) - in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions and guide further development of the model.
Due to the recent growth of biological knowledge, thanks in part to high-throughput methods (sequencing, gene expression microarrays, etc.) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and the Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also

20. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

Science.gov (United States)

Lan, Ganhui; Tu, Yuhai

2016-05-01

preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network - the main players (nodes) and their interactions (links) - in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions and guide further development of the model. Due to the recent growth of biological knowledge, thanks in part to high-throughput methods (sequencing, gene expression microarrays, etc.) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and the Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also

1.
Initial investigation using statistical process control for quality control of accelerator beam steering

Directory of Open Access Journals (Sweden)

Able, Charles M.

2011-12-01

Background: This study seeks to increase clinical operational efficiency and accelerator beam consistency by retrospectively investigating the application of statistical process control (SPC) to linear accelerator beam steering parameters, to determine the utility of such a methodology in detecting changes prior to equipment failure (interlocks actuated). Methods: Steering coil currents (SCC) for the transverse and radial planes are set such that a reproducibly useful photon or electron beam is available. SCC are sampled and stored in the control console computer each day during the morning warm-up. The transverse and radial positioning and angle SCC for photon beam energies were evaluated using average and range (Xbar-R) process control charts (PCC). The weekly average and range values (subgroup n = 5) for each steering coil were used to develop the PCC. SCC from September 2009 (annual calibration) until two weeks following a beam steering failure in June 2010 were evaluated. PCC limits were calculated using the first twenty subgroups. Appropriate action limits were developed using conventional SPC guidelines. Results: The PCC high-alarm action limit was set at 6 standard deviations from the mean. A value exceeding this limit would require beam scanning and evaluation by the physicist and engineer. Two low alarms were used to indicate negative trends. Alarms received following establishment of limits (week 20) are indicative of a non-random cause of deviation (Xbar chart) and/or an uncontrolled process (R chart). Transverse angle SCC for 6 MV and 15 MV indicated a high alarm 90 and 108 days prior to equipment failure, respectively. A downward trend in this parameter continued, with high alarms, until failure.
Transverse position and radial angle SCC for 6 and 15 MV indicated low alarms starting as early as 124 and 116 days prior to failure, respectively. Conclusion: Radiotherapy clinical efficiency and accelerator beam consistency may be improved by

2. Understanding Aroma Release from Model Cheeses by a Statistical Multiblock Approach on Oral Processing

Science.gov (United States)

Feron, Gilles; Ayed, Charfedinne; Qannari, El Mostafa; Courcoux, Philippe; Laboure, Hélène; Guichard, Elisabeth

2014-01-01

For human beings, the mouth is the first organ to perceive food and the different signalling events associated with food breakdown. These events are very complex and, as such, their description necessitates combining different data sets. This study proposed an integrated approach to understand the relative contribution of the main food oral processing events involved in aroma release during cheese consumption. In vivo aroma release was monitored in forty-eight subjects who were asked to eat four different model cheeses varying in fat content and firmness and flavoured with ethyl propanoate and nonan-2-one. A multiblock partial least squares regression was performed to explain aroma release from the different physiological data sets (masticatory behaviour, bolus rheology, saliva composition and flux, mouth coating and bolus moistening). This statistical approach was relevant to point out that aroma release was mostly explained by masticatory behaviour, whatever the cheese and the aroma, with a specific influence of mean amplitude on aroma release after swallowing. Aroma release from the firmer cheeses was explained mainly by bolus rheology. The persistence of hydrophobic compounds in the breath was mainly explained by bolus spreadability, in close relation with bolus moistening. Resting saliva contributed poorly to the analysis, whereas the composition of stimulated saliva was negatively correlated with aroma release, mostly for soft cheeses, when significant. PMID:24691625

3.
Understanding aroma release from model cheeses by a statistical multiblock approach on oral processing.

Science.gov (United States)

Feron, Gilles; Ayed, Charfedinne; Qannari, El Mostafa; Courcoux, Philippe; Laboure, Hélène; Guichard, Elisabeth

2014-01-01

For human beings, the mouth is the first organ to perceive food and the different signalling events associated with food breakdown. These events are very complex and, as such, their description necessitates combining different data sets. This study proposed an integrated approach to understand the relative contribution of the main food oral processing events involved in aroma release during cheese consumption. In vivo aroma release was monitored in forty-eight subjects who were asked to eat four different model cheeses varying in fat content and firmness and flavoured with ethyl propanoate and nonan-2-one. A multiblock partial least squares regression was performed to explain aroma release from the different physiological data sets (masticatory behaviour, bolus rheology, saliva composition and flux, mouth coating and bolus moistening). This statistical approach was relevant to point out that aroma release was mostly explained by masticatory behaviour, whatever the cheese and the aroma, with a specific influence of mean amplitude on aroma release after swallowing. Aroma release from the firmer cheeses was explained mainly by bolus rheology. The persistence of hydrophobic compounds in the breath was mainly explained by bolus spreadability, in close relation with bolus moistening. Resting saliva contributed poorly to the analysis, whereas the composition of stimulated saliva was negatively correlated with aroma release, mostly for soft cheeses, when significant.

4. CEval: All-in-one software for data processing and statistical evaluations in affinity capillary electrophoresis.
Science.gov (United States)

Dubský, Pavel; Ördögová, Magda; Malý, Michal; Riesová, Martina

2016-05-06

We introduce CEval software (downloadable for free at echmet.natur.cuni.cz) that was developed for quicker and easier electrophoregram evaluation and further data processing in (affinity) capillary electrophoresis. This software allows for automatic peak detection and evaluation of common peak parameters, such as migration time, area, width, etc. Additionally, the software includes a nonlinear regression engine that performs peak fitting with the Haarhoff-van der Linde (HVL) function, including an automated initial guess of the HVL function parameters. The HVL function is a fundamental peak-shape function in electrophoresis, based on which the correct effective mobility of the analyte represented by the peak is evaluated. Effective mobilities of an analyte at various concentrations of a selector can be further stored and plotted in an affinity CE mode. Consequently, the mobility of the free analyte, μA, the mobility of the analyte-selector complex, μAS, and the apparent complexation constant, K', are first guessed automatically from the linearized data plots and subsequently estimated by means of nonlinear regression. An option that allows two complexation dependencies to be fitted at once is especially convenient for enantioseparations. Statistical processing of these data is also included, which allowed us to: i) express the 95% confidence intervals for the μA, μAS and K' least-squares estimates, and ii) perform hypothesis testing on the estimated parameters for the first time. We demonstrate the benefits of the CEval software by inspecting complexation of tryptophan methyl ester with two cyclodextrins, neutral heptakis(2,6-di-O-methyl)-β-CD and charged heptakis(6-O-sulfo)-β-CD.

5. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model.
Science.gov (United States)

2007-11-02

Models), contains the To-Be Retail Asset Sustainment Process Model displaying the activities and functions related to the improved processes for receipt...of a logistics process model for a more distant future asset sustainment scenario unconstrained by today's logistics information systems limitations...It also contains a process model reflecting the Reengineering Team's vision of the future asset sustainment process.

6. Waste Receiving and Processing Facility Module 2A: Advanced Conceptual Design Report. Volume 2

Energy Technology Data Exchange (ETDEWEB)

1994-03-01

This volume presents the Total Estimated Cost (TEC) for the WRAP (Waste Receiving and Processing) 2A facility. The TEC is $81.9 million, including an overall project contingency of 25% and escalation of 13%, based on a 1997 construction midpoint. (The mission of WRAP 2A is to receive, process, package, certify, and ship for permanent burial at the Hanford site disposal facilities the Category 1 and 3 contact-handled low-level radioactive mixed wastes that are currently in retrievable storage, and are forecast to be generated over the next 30 years by Hanford, and waste to be shipped to the Hanford site from about 20 DOE sites.)

7. Study of Aerospace Materials, Coatings, Adhesions and Processes. Aircraft Icing Processes. Volume 1.

Science.gov (United States)

1984-09-14

Instituto Nacional de Técnica Aeroespacial "Esteban Terradas", Torrejón de Ardoz (Spain), Dto. Aerodinámica y Navegabilidad. Approved for public release; distribution unlimited.

8. Comparing statistical and process-based flow duration curve models in ungauged basins and changing rain regimes

Science.gov (United States)

Müller, M. F.; Thompson, S. E.

2016-02-01

The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
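The Nash-Sutcliffe coefficient used above to score FDC predictions is simple to compute; a minimal sketch with made-up flow values (not data from the study):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 for a perfect fit; values above
    roughly 0.80 are taken here as a good FDC prediction."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    spread = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / spread

# Illustrative flows (m3/s) read off an observed and a predicted FDC
# at the same exceedance probabilities (values invented for the example):
obs = np.array([120.0, 80.0, 55.0, 30.0, 12.0, 5.0])
sim = np.array([115.0, 85.0, 50.0, 33.0, 10.0, 6.0])
print(round(nash_sutcliffe(obs, sim), 3))  # 0.991
```

A score of 0 means the prediction is no better than the observed mean, and negative values mean it is worse.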

9. A descriptive statistical analysis of volume, visibility and attitudes regarding nursing and care robots in social media.

Science.gov (United States)

Salzmann-Erikson, Martin; Eriksson, Henrik

2017-10-05

Technology in the healthcare sector is undergoing rapid development. One of the most prominent areas of healthcare in which robots are implemented is nursing homes. However, nursing and technology are often considered contradictory, an attitude originating in the view of "the natural" versus "the artificial". Social media mirror this view in attitudes and societal debates regarding nursing and care robots. However, the topic has received little attention in previous research. The aim was to examine user behaviour on social media platforms on the topic of nursing and care robots. A retrospective, cross-sectional observational study design was applied. Data were collected via the Alchemy streaming application programming interface. Data from social media were collected from 1 January 2014 to 5 January 2016. The data set consisted of 12 311 mentions in total. Nursing and care robots are a small-scale topic of discussion in social media. Twitter was found to be the largest channel in terms of volume, followed by Tumblr. News channels had the highest percentage of visibility, while forums and Tumblr had the least. It was found in the data that 67.9% of the mentions were positive, 24.4% were negative and 7.8% were neutral. The volume and visibility of the data on nursing robots found in social media, as well as the attitudes to nursing robots found there, indicate that nursing care robots, seen as a next step in technological development in healthcare, are a topic on the rise in social media. These findings are likely to be related to the idea that nursing care robots are on the verge of replacing human labour in healthcare worldwide.


10. Statistical simulations to estimate motion-inclusive dose-volume histograms for prediction of rectal morbidity following radiotherapy

Science.gov (United States)

THOR, MARIA; APTE, ADITYA; DEASY, JOSEPH O.; MUREN, LUDVIG PAUL

2016-01-01

Background and purpose: Internal organ motion over a course of radiotherapy (RT) leads to uncertainties in the actual delivered dose distributions. In studies predicting RT morbidity, the single estimate of the delivered dose provided by the treatment planning computed tomography (pCT) is typically assumed to be representative of the dose distribution throughout the course of RT. In this paper, a simple model for describing organ motion is introduced and associated with late rectal morbidity data, with the aim of improving morbidity prediction. Material and methods: Organ motion was described by normally distributed translational motion, with its magnitude characterised by the standard deviation (SD) of this distribution. Simulations of both isotropic and anisotropic (anterior-posterior only) motion patterns were performed, as were random, systematic, and combined random and systematic motion. The associations between late rectal morbidity and motion-inclusive delivered dose-volume histograms (dDVHs) were quantified using Spearman's rank correlation coefficient (Rs) in a series of 232 prostate cancer patients, and were compared to the associations obtained with the static/planned DVH (pDVH). Results: For both isotropic and anisotropic motion, different associations with rectal morbidity were seen with the dDVHs relative to the pDVHs. The differences were most pronounced in the mid-dose region (40-60 Gy). The associations were dependent on the applied motion patterns, with the strongest association with morbidity obtained by applying random motion with an SD in the range 0.2-0.8 cm. Conclusion: In this study we have introduced a simple model for describing organ motion occurring during RT. Differing and, in some cases, stronger dose-volume dependencies were found between the motion-inclusive dose distributions and rectal morbidity as compared to the associations with the planned dose distributions. This indicates that rectal organ motion during RT influences the
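The motion model described above is easy to sketch: draw one systematic shift per treatment course and one random shift per fraction, both normally distributed, and accumulate dose over the shifted positions. The dose profile, voxel grid, and all parameter values below are hypothetical, purely to illustrate how a motion-inclusive dDVH can differ from the planned pDVH:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D dose profile (Gy) through the rectal wall: a steep planned gradient
positions = np.linspace(-3.0, 3.0, 121)                # cm
planned_dose = 70.0 / (1.0 + np.exp(positions / 0.5))  # sigmoid fall-off

def delivered_dose(sd_random, sd_systematic, n_fractions=39, n_courses=200):
    """Mean voxel dose under normally distributed translational motion:
    one systematic shift per course, a fresh random shift per fraction."""
    total = np.zeros_like(planned_dose)
    for _ in range(n_courses):
        course_shift = rng.normal(0.0, sd_systematic)
        shifts = course_shift + rng.normal(0.0, sd_random, n_fractions)
        # a voxel displaced by s receives the planned dose at its shifted position
        total += np.mean([np.interp(positions + s, positions, planned_dose)
                          for s in shifts], axis=0)
    return total / n_courses

def dvh(dose, levels=np.arange(0.0, 75.0, 5.0)):
    """Cumulative DVH: fraction of voxels receiving at least each dose level."""
    return np.array([(dose >= lv).mean() for lv in levels])

pdvh = dvh(planned_dose)              # static/planned DVH (pDVH)
ddvh = dvh(delivered_dose(0.5, 0.2))  # motion-inclusive delivered DVH (dDVH)
```

Motion blurs the dose gradient, so the dDVH deviates from the pDVH mostly in the mid- and high-dose bins; it is this kind of difference that the study correlates with morbidity via Spearman's Rs.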

11. Modelling short- and long-term statistical learning of music as a process of predictive entropy reduction

DEFF Research Database (Denmark)

Hansen, Niels Christian; Loui, Psyche; Vuust, Peter

Statistical learning underlies the generation of expectations with different degrees of uncertainty. In music, uncertainty applies to expectations for pitches in a melody. This uncertainty can be quantified by Shannon entropy from distributions of expectedness ratings for multiple continuations...... of each melody, as obtained with the probe-tone paradigm. We hypothesised that statistical learning of music can be modelled as a process of entropy reduction. Specifically, implicit learning of statistical regularities allows reduction in the relative entropy (i.e. symmetrised Kullback-Leibler Divergence...... of musical training, and within-participant decreases in entropy after short-term statistical learning of novel music. Thus, whereas inexperienced listeners make high-entropy predictions, following the Principle of Maximum Entropy, statistical learning over varying timescales enables listeners to generate...
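The quantities underlying this entropy-reduction model, Shannon entropy of an expectedness-rating distribution and the symmetrised Kullback-Leibler divergence between two such distributions, can be sketched directly; the ratings below are hypothetical, not the study's data:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a distribution of expectedness ratings."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def symmetrised_kl(p, q):
    """Symmetrised Kullback-Leibler divergence (bits); assumes p, q > 0."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    kl = lambda a, b: float(np.sum(a * np.log2(a / b)))
    return 0.5 * (kl(p, q) + kl(q, p))

# Hypothetical expectedness ratings over 12 candidate continuation tones
novice = np.ones(12)                                       # flat: no clear expectations
expert = np.array([8.0, 1, 1, 1, 6, 1, 1, 9, 1, 1, 1, 1])  # peaked on a few tones

print(shannon_entropy(novice))  # 3.585 bits: the maximum for 12 alternatives
print(shannon_entropy(expert) < shannon_entropy(novice))  # True: lower uncertainty
print(symmetrised_kl(expert, novice))
```

A flat distribution realises the Principle of Maximum Entropy mentioned above; statistical learning corresponds to the rated distribution sharpening, i.e. its entropy dropping below that maximum.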

12. Characterizing the statistical structure of bathymetry and topography as a Matérn process

Science.gov (United States)

Simons, Frederik J.; Olhede, Sofia C.; Eggers, Gabe L.; Lewis, Kevin W.

2014-05-01

Describing and classifying the statistical structure of topography and bathymetry is of much interest across the geophysical sciences. Oceanographers are interested in the roughness of seafloor bathymetry as a parameter that can be linked to internal-wave generation and mixing of ocean currents. Tectonicists are searching for ways to link the shape and fracturing of the ocean floor to build detailed models of the evolution of the ocean basins in a plate-tectonic context. Geomorphologists are building time-dependent models of the surface that benefit from sparsely parameterized representations whose evolution can be described by differential equations. Geophysicists seek access to parameterized forms for the spectral shape of topographic or bathymetric loading at various (sub)surface interfaces in order to use the joint structure of topography and gravity in inversions for the effective elastic thickness of the lithosphere. Planetary scientists are in need of robust terrain-classification models to help unravel the cratering history and tectonic evolution of planetary surfaces, for the selection of suitable landing sites, and for purposes as mundane as the prediction of wear and tear on rover wheels. Finally, statisticians, mathematicians and computer scientists are interested in the analysis of texture for purposes of out-of-sample prediction, extension, and in-painting for application in fields as diverse as computer graphics and medical imaging. A unified geostatistical framework for the description, characterization and study of surfaces of these various kinds, and for such a multitude of applications, is the Matérn process, a theoretically well-justified and mathematically attractive parameterized form for the spectral-domain covariance of Gaussian processes, both in isotropic form and considering various geometrical kinds of anisotropy. We discuss a constructive new estimation technique to find the parameters of the Matérn forms of topography and bathymetry
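The Matérn covariance referred to above has a standard closed form in the space domain; a sketch using SciPy's modified Bessel function of the second kind, with illustrative parameter values (not estimates from the abstract's data):

```python
import numpy as np
from scipy.special import gamma, kv

def matern_covariance(r, sigma2=1.0, nu=1.5, rho=1.0):
    """Matern covariance C(r) = sigma2 * 2^(1-nu)/Gamma(nu) * u^nu * K_nu(u),
    with u = sqrt(2*nu)*r/rho; nu controls smoothness and rho the range."""
    r = np.atleast_1d(np.asarray(r, dtype=float))
    c = np.full(r.shape, sigma2)   # C(0) = sigma2, the limit of the formula
    pos = r > 0                    # K_nu is singular at u = 0, so mask it out
    u = np.sqrt(2.0 * nu) * r[pos] / rho
    c[pos] = sigma2 * 2.0 ** (1.0 - nu) / gamma(nu) * u ** nu * kv(nu, u)
    return c

lags = np.array([0.0, 0.5, 1.0, 2.0])
# For nu = 1/2 the Matern form reduces to the exponential model exp(-r/rho):
print(matern_covariance(lags, nu=0.5))
```

The smoothness parameter nu is what makes the family useful for terrain classification: nu = 1/2 gives rough exponential-type surfaces, while nu → ∞ approaches the infinitely smooth Gaussian covariance.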

13. Statistics for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater runoff best management practices (BMPs)

Science.gov (United States)

Granato, Gregory E.

2014-01-01

The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. They are designed to provide a stochastic transfer function to approximate
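The trapezoidal distribution used for stochastic BMP treatment can be sampled with a short inverse-CDF routine. A minimal sketch, illustrative only and not the USGS SELDM implementation, assuming strictly increasing break points a < b < c < d (lower bound, plateau start, plateau end, upper bound):

```python
import numpy as np

def trapezoid_ppf(u, a, b, c, d):
    """Inverse CDF of a trapezoidal distribution on [a, d] with a
    constant-density plateau on [b, c]; u is uniform on [0, 1]."""
    u = np.asarray(u, dtype=float)
    h = 2.0 / (d + c - a - b)               # plateau density height
    u1 = h * (b - a) / 2.0                  # CDF mass of the rising limb
    u2 = u1 + h * (c - b)                   # ... plus the plateau mass
    x = np.empty_like(u)
    lo, mid, hi = u < u1, (u >= u1) & (u <= u2), u > u2
    x[lo] = a + np.sqrt(2.0 * u[lo] * (b - a) / h)
    x[mid] = b + (u[mid] - u1) / h
    x[hi] = d - np.sqrt(2.0 * (1.0 - u[hi]) * (d - c) / h)
    return x

# Monte Carlo draws, e.g. a hypothetical fractional volume reduction in [0, 1]:
rng = np.random.default_rng(42)
samples = trapezoid_ppf(rng.random(10_000), 0.0, 0.2, 0.6, 1.0)
```

In a SELDM-like simulation these draws would then be rank-correlated with the associated runoff variables; that step is omitted here.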

14. Statistical learning and auditory processing in children with music training: An ERP study.

Science.gov (United States)

Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne

2017-07-01

15. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

Energy Technology Data Exchange (ETDEWEB)

None

1996-06-01

NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human-system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review, which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.

16. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

Science.gov (United States)

Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

2014-12-01

As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

17. Optimized statistical parametric mapping for partial-volume-corrected amyloid positron emission tomography in patients with Alzheimer's disease and Lewy body dementia

Science.gov (United States)

Oh, Jungsu S.; Kim, Jae Seung; Chae, Sun Young; Oh, Minyoung; Oh, Seung Jun; Cha, Seung Nam; Chang, Ho-Jong; Lee, Chong Sik; Lee, Jae Hong

2017-03-01

We present an optimized voxelwise statistical parametric mapping (SPM) of partial-volume (PV)-corrected positron emission tomography (PET) of 11C Pittsburgh Compound B (PiB), incorporating the anatomical precision of magnetic resonance imaging (MRI) and the amyloid-β (Aβ) burden specificity of PiB PET. First, we applied region-based partial-volume correction (PVC), termed the geometric transfer matrix (GTM) method, to PiB PET, creating MRI-based lobar parcels filled with mean PiB uptakes. Then, we conducted a voxelwise PVC by multiplying the original PET by the ratio of a GTM-based PV-corrected PET to a 6-mm-smoothed PV-corrected PET. Finally, we conducted spatial normalizations of the PV-corrected PETs onto the study-specific template. As such, we increased the accuracy of the SPM normalization and the tissue specificity of the SPM results. Moreover, lobar smoothing (instead of whole-brain smoothing) was applied to increase the signal-to-noise ratio in the image without degrading the tissue specificity. Thereby, we could optimize a voxelwise group comparison between subjects with high and normal Aβ burdens (from 10 patients with Alzheimer's disease, 30 patients with Lewy body dementia, and 9 normal controls). Our SPM framework outperformed the conventional one in terms of the accuracy of the spatial normalization (85% of maximum likelihood tissue classification volume) and the tissue specificity (larger gray matter and smaller cerebrospinal fluid volume fraction in the SPM results). Our SPM framework optimized the SPM of a PV-corrected Aβ PET in terms of anatomical precision, normalization accuracy, and tissue specificity, resulting in better detection and localization of Aβ burdens in patients with Alzheimer's disease and Lewy body dementia.
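The voxelwise correction step described above, scaling the original PET by the ratio of the region-filled image to its re-smoothed version, can be sketched in a few lines. This is a generic illustration of that ratio scheme, not the authors' pipeline; the voxel size and the epsilon guard against division by near-zero are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

FWHM_TO_SIGMA = 2.0 * np.sqrt(2.0 * np.log(2.0))  # ~2.3548

def voxelwise_pvc(pet, gtm_filled, fwhm_mm=6.0, voxel_mm=2.0, eps=1e-6):
    """Scale the original PET voxelwise by the ratio of the GTM-filled
    image to the same image smoothed at the assumed 6-mm FWHM resolution."""
    sigma_vox = (fwhm_mm / voxel_mm) / FWHM_TO_SIGMA
    smoothed = gaussian_filter(gtm_filled, sigma_vox)
    return pet * gtm_filled / np.maximum(smoothed, eps)
```

Where the region-filled image is locally uniform, the ratio is 1 and the PET voxels pass through unchanged; the correction acts mainly near tissue boundaries, where smoothing redistributes signal.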

18. Distributed location-based query processing on large volumes of moving items

Institute of Scientific and Technical Information of China (English)

JEON Se-gil; LEE Chung-woo; NAH Yunmook; KIM Moon-hae; HAN Ki-joon

2004-01-01

Recently, new techniques to efficiently manage current and past location information of moving objects have received significant interest in the area of moving object databases and location-based service systems. In this paper, we exploit query processing schemes for location management systems, which consist of multiple data processing nodes to handle massive volumes of moving objects such as cellular phone users. To show the usefulness of the proposed schemes, some experimental results showing performance factors regarding distributed query processing are explained. In our experiments, we use two kinds of data sets: one is generated by the extended GSTD simulator and the other is generated by a real-time data generator which produces location sensing reports of various types of users having different movement patterns.

19. Harmonic statistics

Science.gov (United States)

Eliazar, Iddo

2017-05-01

The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed the harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
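A harmonic Poisson process restricted to a finite window [a, b] is easy to simulate: the count is Poisson with mean c·ln(b/a), and given the count the points are i.i.d. with density proportional to 1/x. A small sketch under those assumptions (the finite window is my restriction for simulation; the object in the paper lives on the whole positive half-line):

```python
import numpy as np

def harmonic_poisson(a, b, c=1.0, seed=None):
    """Sample a Poisson process on [a, b] with harmonic intensity c / x."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(c * np.log(b / a))      # expected count = c * ln(b / a)
    # density 1/x on [a, b]  <=>  log(x) uniform on [log a, log b]
    return a * (b / a) ** rng.random(n)
```

Rescaling the window to [s·a, s·b] leaves the count distribution unchanged, which is the scale invariance the abstract mentions.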

20. Measurement and modeling of advanced coal conversion processes, Volume I, Part 2. Final report, September 1986--September 1993

Energy Technology Data Exchange (ETDEWEB)

Solomon, P.R.; Serio, M.A.; Hamblen, D.G. [and others

1995-09-01

This report describes work pertaining to the development of models for coal gasification and combustion processes. This volume, volume 1, part 2, contains research progress in the areas of large particle oxidation at high temperatures, large particle, thick-bed submodels, sulfur oxide/nitrogen oxides submodels, and comprehensive model development and evaluation.

1. Refilling process in the plasmasphere: a 3-D statistical characterization based on Cluster density observations

Directory of Open Access Journals (Sweden)

G. Lointier

2013-02-01

The Cluster mission offers an excellent opportunity to investigate the evolution of the plasma population in a large part of the inner magnetosphere, explored near its orbit's perigee, over a complete solar cycle. The WHISPER sounder, on board each satellite of the mission, is particularly suitable for studying the electron density in this region, between 0.2 and 80 cm−3. Compiling WHISPER observations during 1339 perigee passes distributed over more than three years of the Cluster mission, we present first results of a statistical analysis dedicated to the study of electron density morphology and dynamics along and across magnetic field lines between L = 2 and L = 10. In this study, we examine a specific topic: the refilling of the plasmasphere and trough regions during extended periods of quiet magnetic conditions. To do so, we survey the evolution of the ap index during the days preceding each perigee crossing and sort electron density profiles along the orbit into three classes, namely after respectively less than 2 days, between 2 and 4 days, and more than 4 days of quiet magnetic conditions (ap ≤ 15 nT) following an active episode (ap > 15 nT). This leads to three independent data subsets. Comparisons between density distributions in the 3-D plasmasphere and trough regions at the three stages of quiet magnetosphere provide novel views of the distribution of matter inside the inner magnetosphere during several days of low activity. Clear signatures of a refilling process inside an expanding plasmasphere in formation are noted. A plasmapause-like boundary, at L ~ 6 for all MLT sectors, is formed after 3 to 4 days and expands somewhat further after that. In the outer part of the plasmasphere (L ~ 8), latitudinal profiles of median density values vary essentially according to the MLT sector considered rather than according to the refilling duration. The shape of these density profiles

2. Mechanisms controlling warm water volume interannual variations in the equatorial Pacific: diabatic versus adiabatic processes

Energy Technology Data Exchange (ETDEWEB)

Lengaigne, M. [CNRS, UPMC, IRD, Laboratoire d' Oceanographie Experimentation et Approches Numeriques, Paris (France); Paris Cedex 05 (France); Hausmann, U. [Imperial College, Department of Physics, London (United Kingdom); Madec, G. [CNRS, UPMC, IRD, Laboratoire d' Oceanographie Experimentation et Approches Numeriques, Paris (France); National Oceanographic Centre, Southampton (United Kingdom); Menkes, C.; Vialard, J. [CNRS, UPMC, IRD, Laboratoire d' Oceanographie Experimentation et Approches Numeriques, Paris (France); Molines, J.M. [CNRS, UJF, INP, Laboratoire Ecoulements Geophysiques et Industriels, Grenoble (France)

2012-03-15

Variations of the volume of warm water above the thermocline in the equatorial Pacific are a good predictor of ENSO (El Nino/Southern Oscillation) and are thought to be critical for its preconditioning and development. In this study, the Warm Water Volume (WWV) interannual variability is analysed using forced general circulation model experiments and an original method for diagnosing the processes responsible for WWV variations. The meridional recharge/discharge to higher latitudes drives 60% of the ENSO-related equatorial WWV variations, while diabatic processes in the eastern equatorial Pacific account for the remaining 40%. Interior meridional transport is partially compensated by western boundary transports, especially in the southern hemisphere. Diabatic equatorial WWV formation (depletion) during La Nina (El Nino) is explained by enhanced (reduced) diathermal transport through enhanced (reduced) vertical mixing and penetrative solar forcing at the 20 °C isotherm depth. The respective contributions of diabatic and adiabatic processes during build-ups/depletions vary strongly from event to event. The WWV build-up during neutral ENSO phases (e.g. 1980-1982) is almost entirely controlled by meridional recharge, providing a textbook example of the recharge/discharge oscillator theory. On the other hand, diabatic processes are particularly active during the strongest La Nina events (1984, 1988, 1999), contributing more than 70% of the WWV build-up, with heating by penetrative solar fluxes explaining as much as 30% of the total build-up due to a very shallow thermocline in the eastern Pacific. This study does not invalidate the recharge/discharge oscillator theory but rather emphasizes the importance of equatorial diabatic processes and western boundary transports in controlling WWV changes. (orig.)

3. Building high-performance system for processing a daily large volume of Chinese satellites imagery

Science.gov (United States)

Deng, Huawu; Huang, Shicun; Wang, Qi; Pan, Zhiqiang; Xin, Yubin

2014-10-01

The number of Earth observation satellites from China has increased dramatically in recent years, and those satellites acquire a large volume of imagery daily. As the main portal for image processing and distribution from those Chinese satellites, the China Centre for Resources Satellite Data and Application (CRESDA) has been working with PCI Geomatics during the last three years to solve two issues: processing the large volume of data (about 1,500 scenes or 1 TB per day) in a timely manner and generating geometrically accurate orthorectified products. After three years of research and development, a high-performance system has been built and successfully delivered. The system has a service-oriented architecture and can be deployed to a cluster of computers that may be configured with high-end computing power. The high performance is gained, first, by parallelizing image processing algorithms with high-performance graphics processing unit (GPU) cards and multiple cores from multiple CPUs, and, second, by distributing processing tasks to a cluster of computing nodes. While achieving up to thirty (and even more) times faster performance compared with the traditional practice, a particular methodology was developed to improve the geometric accuracy of images acquired from Chinese satellites (including HJ-1 A/B, ZY-1-02C, ZY-3, GF-1, etc.). The methodology consists of fully automatic collection of dense ground control points (GCPs) from various resources and application of those points to improve the photogrammetric model of the images. The delivered system is up and running at CRESDA for pre-operational production and has been generating a good return on investment by eliminating a great amount of manual labor and increasing daily data throughput more than tenfold with fewer operators. Future work, such as development of more performance-optimized algorithms, robust image matching methods and application

4. Avalanche Statistics Identify Intrinsic Stellar Processes near Criticality in KIC 8462852

Science.gov (United States)

Sheikh, Mohammed A.; Weaver, Richard L.; Dahmen, Karin A.

2016-12-01

The star KIC8462852 (Tabby's star) has shown anomalous drops in light flux. We perform a statistical analysis of the more numerous smaller dimming events by using methods found useful for avalanches in ferromagnetism and plastic flow. Scaling exponents for avalanche statistics and temporal profiles of the flux during the dimming events are close to mean field predictions. Scaling collapses suggest that this star may be near a nonequilibrium critical point. The large events are interpreted as avalanches marked by modified dynamics, limited by the system size, and not within the scaling regime.
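Avalanche analyses of this kind typically estimate a power-law exponent for event sizes above a threshold. A standard maximum-likelihood estimator for the exponent (the Hill/Clauset form) can be sketched as follows; this is a generic illustration, not the authors' exact pipeline:

```python
import numpy as np

def powerlaw_mle(sizes, s_min):
    """MLE of the exponent tau for P(s) ~ s**(-tau), restricted to s >= s_min."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + s.size / np.log(s / s_min).sum()

# check on synthetic data with a known exponent (inverse-transform sampling)
rng = np.random.default_rng(1)
tau_true = 2.5
s = 1.0 * (1.0 - rng.random(20_000)) ** (-1.0 / (tau_true - 1.0))
tau_hat = powerlaw_mle(s, 1.0)
```

The estimator's standard error scales as (tau - 1)/sqrt(n), so with 20,000 events the recovered exponent is accurate to about 0.01.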

5. Intrinsic Volumes of the Maximal Facet Process in Higher Dimensional STIT Tessellations

CERN Document Server

Schreiber, Tomasz

2010-01-01

Stationary and isotropic random tessellations in $\mathbb{R}^d$ are considered, which are stable under the operation of iteration -- so-called STIT tessellations. These tessellations are constructed by a random process of cell division, and we analyze first- and second-order properties of the intrinsic volumes of the collection of all cell-separating facets (so-called maximal or I-facets) at a fixed time $t$ in a convex window $W \subset \mathbb{R}^d$. We provide formulas for mean values, exact and asymptotic variances, as well as a characterization of the lower-dimensional face covariance measures. We focus here on the case $d \geq 3$, which is more involved than the planar case, treated separately in an earlier paper. Besides the aforementioned results, we prove central limit theorems for the process of suitably rescaled intrinsic volumes, leading -- in sharp contrast to the situation in the plane -- to non-Gaussian limit distributions.

6. Using Statistical Process Control Charts to Identify the Steroids Era in Major League Baseball: An Educational Exercise

Science.gov (United States)

Hill, Stephen E.; Schvaneveldt, Shane J.

2011-01-01

This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…

7. Electrospinning of polyaniline/poly(lactic acid) ultrathin fibers: process and statistical modeling using a non-Gaussian approach

Science.gov (United States)

Cover: The electrospinning technique was employed to obtain conducting nanofibers based on polyaniline and poly(lactic acid). A statistical model was employed to describe how the process factors (solution concentration, applied voltage, and flow rate) govern the fiber dimensions. Nanofibers down to ...

8. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method : An Example with Alendronate

NARCIS (Netherlands)

van de Glind, Esther M. M.; Willems, Hanna C.; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F.; Hooft, Lotty; de Rooij, Sophia E.; Black, Dennis M.; van Munster, Barbara C.

2016-01-01

For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in relatio

9. Statistical analysis of the V-tool bending process parameters in the bending of HC260Y steel

Directory of Open Access Journals (Sweden)

J. Cumin

2016-04-01

This paper presents a statistical analysis of the parameters in the V-tool bending process of HC260Y steel. Assessment of the mathematical model and analysis of variance (ANOVA) were performed within the design of experiments. The Amsler hydraulic testing machine and the developed V-tool were used in the experiments.

10. Statistical methods for processing seismic information and studying complicated media. Statisticheskiye metody obrabotki seysmicheskoy informatsii pri issledovanii slozhnykh sred

Energy Technology Data Exchange (ETDEWEB)

Troyan, V.N.

1982-01-01

For the first time, materials are generalized concerning statistical methods of processing seismic information, which are increasingly widely used in prospecting for minerals (oil, gas, ore) in regions of complex structure and at great depths. The methods provide reliable identification of useful signals against a background of interference. The fundamentals of constructing the algorithms and programs used in interpretation are examined, along with their efficiency.

11. A new statistical procedure for testing the presence of a significant correlation in the residuals of stable autoregressive processes

CERN Document Server

Proïa, Frédéric

2012-01-01

The purpose of this paper is to investigate the asymptotic behavior of the Durbin-Watson statistic for the general stable $p$th-order autoregressive process when the driving noise is given by a first-order autoregressive process. We establish the almost sure convergence and the asymptotic normality both for the least squares estimator of the unknown vector parameter of the autoregressive process and for the serial correlation estimator associated with the driving noise. In addition, the almost sure rates of convergence of our estimates are also provided. Then, we prove the almost sure convergence and the asymptotic normality of the Durbin-Watson statistic. Finally, we propose a new bilateral statistical procedure for testing the presence of a significant first-order residual autocorrelation, and we explain how our procedure performs better than the commonly used Box-Pierce and Ljung-Box tests for white noise applied to the stable autoregressive process, even on small samples.
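For reference, the Durbin-Watson statistic itself is a one-line computation on the residuals; values near 2 indicate no first-order autocorrelation, values near 0 positive autocorrelation, and values near 4 negative autocorrelation. A minimal sketch:

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum (e_t - e_{t-1})^2 / sum e_t^2; approximately 2 * (1 - r1),
    where r1 is the lag-1 autocorrelation of the residuals."""
    e = np.asarray(resid, dtype=float)
    return np.square(np.diff(e)).sum() / np.square(e).sum()
```

Perfectly alternating residuals drive the statistic toward 4, while i.i.d. noise gives a value close to 2.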

12. The Prospective Mathematics Teachers' Thought Processes and Views about Using Problem-Based Learning in Statistics Education

Science.gov (United States)

Canturk-Gunhan, Berna; Bukova-Guzel, Esra; Ozgur, Zekiye

2012-01-01

The purpose of this study is to determine prospective mathematics teachers' views about using problem-based learning (PBL) in statistics teaching and to examine their thought processes. It is a qualitative study conducted with 15 prospective mathematics teachers from a state university in Turkey. The data were collected via participant observation…

13. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

Science.gov (United States)

Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
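The binomial logic behind such a sampling design reduces to a simple inversion: if a fraction p of kernels carries a tracer, the probability that a sample of n kernels contains at least one tracer is 1 - (1 - p)^n, which can be solved for the required sample size. A hedged sketch; the tracer fraction and confidence level below are illustrative, not values from the study:

```python
import math

def min_sample_size(p_tracer, confidence=0.95):
    """Smallest n with P(at least one tracer among n kernels) >= confidence,
    assuming independent draws with tracer fraction p_tracer."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_tracer))

def detection_prob(p_tracer, n):
    """Probability of finding at least one tracer in a sample of n kernels."""
    return 1.0 - (1.0 - p_tracer) ** n
```

For example, with a tracer fraction of 1%, roughly 300 kernels are needed for 95% confidence of recovering at least one tracer.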

14. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

Science.gov (United States)

Averitt, Sallie D.

This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

15. Frontiers in statistical quality control 11

CERN Document Server

Schmid, Wolfgang

2015-01-01

The main focus of this edited volume is on three major areas of statistical quality control: statistical process control (SPC), acceptance sampling and design of experiments. The majority of the papers deal with statistical process control, while acceptance sampling and design of experiments are also treated to a lesser extent. The book is organized into four thematic parts, with Part I addressing statistical process control. Part II is devoted to acceptance sampling. Part III covers the design of experiments, while Part IV discusses related fields. The twenty-three papers in this volume stem from The 11th International Workshop on Intelligent Statistical Quality Control, which was held in Sydney, Australia from August 20 to August 23, 2013. The event was hosted by Professor Ross Sparks, CSIRO Mathematics, Informatics and Statistics, North Ryde, Australia and was jointly organized by Professors S. Knoth, W. Schmid and Ross Sparks. The papers presented here were carefully selected and reviewed by the scientifi...

16. Finnish Upper Secondary Students' Collaborative Processes in Learning Statistics in a CSCL Environment

Science.gov (United States)

Oikarinen, Juho Kaleva; Järvelä, Sanna; Kaasila, Raimo

2014-01-01

This design-based research project focuses on documenting statistical learning among 16-17-year-old Finnish upper secondary school students (N = 78) in a computer-supported collaborative learning (CSCL) environment. One novel value of this study is in reporting the shift from teacher-led mathematical teaching to autonomous small-group learning in…

17. The Underlying Process Generating Lotka's Law and the Statistics of Exceedances.

Science.gov (United States)

Huber, John C.

1998-01-01

Demonstrates that the statistics of exceedances generates Lotka's Law--a widely-observed distribution of authors of scholarly papers and patents. The Frequency of production (papers or patents per year) and Lifetime (career duration) are exponentially distributed random variables. Empirical, phenomenological, and mathematical development shows…

18. Statistical processing of facial electromyography (EMG) signals in emotional film scenes

NARCIS (Netherlands)

Westerink, Joyce; van den Broek, Egon; van Herk, Jan; Tuinenbreijer, Kees; Schut, Marleen

To improve human-computer interaction, computers need to recognize and respond properly to their users’ emotional state. As a first step to such systems, we investigated how emotional experiences are expressed in various statistical parameters of facial EMG signals. 22 Subjects were presented with 8

19. Low-loss polysilicon waveguides fabricated in an emulated high-volume electronics process.

Science.gov (United States)

Orcutt, Jason S; Tang, Sanh D; Kramer, Steve; Mehta, Karan; Li, Hanqing; Stojanović, Vladimir; Ram, Rajeev J

2012-03-26

We measure end-of-line polysilicon waveguide propagation losses of ~6-15 dB/cm across the telecommunication O-, E-, S-, C- and L-bands in a process representative of high-volume product integration. The lowest loss of 6.2 dB/cm is measured at 1550 nm in a polysilicon waveguide with a 120 nm x 350 nm core geometry. The reported waveguide characteristics are measured after the thermal cycling of the full CMOS electronics process that results in a 32% increase in the extracted material loss relative to the as-crystallized waveguide samples. The measured loss spectra are fit to an absorption model using defect state parameters to identify the dominant loss mechanism in the end-of-line and as-crystallized polysilicon waveguides.

20. Waste Receiving and Processing Facility Module 2A: Advanced Conceptual Design Report. Volume 1

Energy Technology Data Exchange (ETDEWEB)

1994-03-01

This ACDR was performed following completion of the Conceptual Design Report (CDR) in July 1992; the work encompassed August 1992 to January 1994. The mission of the WRAP Module 2A facility is to receive, process, package, certify, and ship for permanent burial at the Hanford site disposal facilities the Category 1 and 3 contact-handled low-level radioactive mixed wastes that are currently in retrievable storage at Hanford, that are forecast to be generated over the next 30 years by Hanford, and that are to be shipped to Hanford from other DOE sites. This volume provides an introduction to the ACDR process and the scope of the task, along with a project summary of the facility, treatment technologies, cost, and schedule. Major areas of departure from the CDR are highlighted. Descriptions of the facility layout and operations are included.

1. Statistical optimization of process parameters influencing the biotransformation of plant tannin into gallic acid under solid-liquid fermentation

OpenAIRE

Bibhu Prasad Panda; Rupa Mazumder; Rintu Banerjee

2009-01-01

Purpose: To optimize and produce gallic acid by biotransformation of plant tannin under solid-liquid fermentation. Materials and Methods: Optimization of different process parameters, such as temperature, relative humidity, pH of the liquid medium, fermentation period, volume of inoculum, and weight of substrate, influencing gallic acid production from plant tannin was carried out by the EVOP factorial method. Results: A maximum gallic acid yield of 93.29% was produced at 28 °C, 70% relative humidity, pH ...

2. Effect Size Matters: The Role of Language Statistics and Perceptual Simulation in Conceptual Processing

NARCIS (Netherlands)

Louwerse, M.M.; Hutchinson, S; Tillman, R.N.

2014-01-01

The cognitive science literature increasingly demonstrates that perceptual representations are activated during conceptual processing. Such findings suggest that the debate on whether conceptual processing is predominantly symbolic or perceptual has been resolved. However, studies too frequently pro

3. Statistical optimization of process parameters for exopolysaccharide production by Aureobasidium pullulans using sweet potato based medium

OpenAIRE

Padmanaban, Sethuraman; Balaji, Nagarajan; Muthukumaran, Chandrasekaran; Tamilarasan, Krishnamurthi

2015-01-01

Statistical experimental designs were applied to optimize the fermentation medium for exopolysaccharide (EPS) production. Plackett–Burman design was applied to identify the significance of seven medium variables, in which sweet potato and yeast extract were found to be the significant variables for EPS production. Central composite design was applied to evaluate the optimum condition of the selected variables. Maximum EPS production of 9.3 g/L was obtained with the predicted optimal level of ...

4. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

Science.gov (United States)

2012-01-01

1572 Final Report, Sky Research, Inc., January 2012. This model-based approach has the desirable traits (1) that it permits the use of objective statistical criteria, like the Akaike Information Criterion

5. Statistical Modeling Applied to Deformation-Relaxation Processes in a Composite Biopolymer Network Induced by Magnetic Field

Science.gov (United States)

Tarrío-Saavedra, Javier; González, Cécilia Galindo; Naya, Salvador; López-Beceiro, Jorge

2017-01-01

This study investigated a methodology based on image processing and statistics to characterize and model the deformation under a controlled and uniform magnetic field, and the relaxation under zero field, of droplets observed in aqueous solutions of sodium alginate incorporating magnetic maghemite nanoparticles stabilized by adsorption of citrate ions. The changes of droplet geometry were statistically analyzed using a new approach based on data obtained from optical microscopy, image processing, nonlinear regression, evolutionary optimization, analysis of variance and resampling. Image enhancement and then image segmentation (Gaussian mixture modeling) were applied to extract features with reliable information on droplet dimensions from optical micrographs. The droplet deformation and relaxation trends were accurately fitted by the Kohlrausch-Williams-Watts (KWW) function, and a mean relaxation time was obtained by fitting the time evolution of the geometry parameters. It was found to be proportional to the initial radius of the spherical droplets and was associated with interfacial tension. PMID:28081239
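
The stretched-exponential (KWW) fit described above can be sketched with a standard nonlinear least-squares routine. The relaxation trace, noise level and parameter values below are hypothetical illustrations, not data from the study:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma

# KWW (stretched-exponential) relaxation: g(t) = a * exp(-(t/tau)^beta)
def kww(t, a, tau, beta):
    return a * np.exp(-(t / tau) ** beta)

# Synthetic "droplet geometry" relaxation trace (hypothetical parameters)
t = np.linspace(0.01, 10.0, 200)
rng = np.random.default_rng(0)
y = kww(t, 1.0, 2.0, 0.7) + rng.normal(0.0, 0.005, t.size)

popt, _ = curve_fit(kww, t, y, p0=[1.0, 1.0, 1.0],
                    bounds=([0.0, 0.01, 0.1], [2.0, 20.0, 1.5]))
a_hat, tau_hat, beta_hat = popt

# Mean relaxation time of the KWW law: <tau> = (tau/beta) * Gamma(1/beta)
mean_relaxation_time = (tau_hat / beta_hat) * gamma(1.0 / beta_hat)
```

The mean relaxation time follows from integrating the KWW function analytically, which avoids numerically integrating a noisy trace.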

6. Proceedings of waste stream minimization and utilization innovative concepts: An experimental technology exchange. Volume 2, Industrial liquid waste processing, industrial gaseous waste processing

Energy Technology Data Exchange (ETDEWEB)

Lee, V.E. [ed.; Watts, R.L.

1993-04-01

This two-volume proceedings summarizes the results of fifteen innovations that were funded through the US Department of Energy's Innovative Concept Program. The fifteen innovations were presented at the sixth Innovative Concepts Fair, held in Austin, Texas, on April 22--23, 1993. The concepts in this year's fair address innovations that can substantially reduce or use waste streams. Each paper describes the need for the proposed concept, the concept being proposed, the concept's economics and market potential, key experimental results, and future development needs. The papers are divided into two volumes: Volume 1 addresses innovations for industrial solid waste processing and municipal waste reduction/recycling, and Volume 2 addresses industrial liquid waste processing and industrial gaseous waste processing. Individual reports are indexed separately.

7. Statistical Inference for Ergodic Point Processes and Application to Limit Order Book

OpenAIRE

Clinet, Simon; Yoshida, Nakahiro

2015-01-01

We construct a general procedure for the Quasi Likelihood Analysis applied to a multivariate point process on the real half line in an ergodic framework. More precisely, we assume that the stochastic intensity of the underlying model belongs to a family of processes indexed by a finite dimensional parameter. When a particular family of laws of large numbers applies to those processes, we establish the consistency, the asymptotic normality and the convergence of moments of both the Quasi Maxim...

8. Statistical inference for critical continuous state and continuous time branching processes with immigration

OpenAIRE

2014-01-01

We study asymptotic behavior of conditional least squares estimators for critical continuous state and continuous time branching processes with immigration based on discrete time (low frequency) observations.

9. Development of the NRC's Human Performance Investigation Process (HPIP). Volume 2, Investigator's Manual

Energy Technology Data Exchange (ETDEWEB)

Paradies, M.; Unger, L. [System Improvements, Inc., Knoxville, TN (United States); Haas, P.; Terranova, M. [Concord Associates, Inc., Knoxville, TN (United States)

1993-10-01

The three volumes of this report detail a standard investigation process for use by US Nuclear Regulatory Commission (NRC) personnel when investigating human performance related events at nuclear power plants. The process, called the Human Performance Investigation Process (HPIP), was developed to meet the special needs of NRC personnel, especially NRC resident and regional inspectors. HPIP is a systematic investigation process combining current procedures and field practices, expert experience, NRC human performance research, and applicable investigation techniques. The process is easy to learn and helps NRC personnel perform better field investigations of the root causes of human performance problems. The human performance data gathered through such investigations provide a better understanding of the human performance issues that cause events at nuclear power plants. This document, Volume II, is a field manual for use by investigators when performing event investigations. Volume II includes the HPIP Procedure, the HPIP Modules, and Appendices that provide extensive documentation of each investigation technique.

10. EPR statistical mixture of correlated states with fractional brownian process induced by third party interaction

CERN Document Server

Tamburini, F; Bianchini, A

1999-01-01

A time-correlated EPR pairs protocol is analyzed, based on detection of fractal correlated signals in a statistical mixture of EPR correlated pairs: an approximated alpha-Fractional Brownian Motion (FBM) is induced on the group of EPR pairs (e.g. by sender-third party eavesdropper-like interactions as in Ekert quantum cryptography), to be detected by the receiver using a non-orthogonal wavelet filter, able to characterize the FBM from a noisy environment by formalizing a nonlinear optimization problem for the estimation of the FBM alpha-characteristic parameter.

11. Constitutive Modelling in Thermomechanical Processes, Using The Control Volume Method on Staggered Grid

DEFF Research Database (Denmark)

Thorborg, Jesper

of the method has been focused on high temperature processes such as casting and welding and the interest of using nonlinear constitutive stress-strain relations has grown to extend the applicability of the method. The work of implementing classical plasticity into the control volume formulation has been based...... on the $J_2$ flow theory describing an isotropic hardening material with a temperature dependent yield stress. This work has successfully been verified by comparing results to analytical solutions. Due to the comprehensive implementation in the staggered grid an alternative constitutive stress......-strain relation has been suggested. The intention of this method is to provide fast numerical results with reasonable accuracy in relation to the first order effects of the presented classical plasticity model. Application of the $J_2$ flow theory and the alternative method have shown some agreement...

12. Volume reduction outweighs biogeochemical processes in controlling phosphorus treatment in aged detention systems

Science.gov (United States)

Shukla, Asmita; Shukla, Sanjay; Annable, Michael D.; Hodges, Alan W.

2017-08-01

Stormwater detention areas (SDAs) play an important role in treating end-of-the-farm runoff in phosphorus (P)-limited agroecosystems. Phosphorus transport from the SDAs, including that through subsurface pathways, is not well understood. The prevailing understanding of these systems assumes that biogeochemical processes play the primary treatment role and that subsurface losses can be neglected. Water and P fluxes from a SDA located in a row-crop farm were measured for two years (2009-2011) to assess the SDA's role in reducing downstream P loads. The SDA treated 55% (497 kg) and 95% (205 kg) of the incoming load during Year 1 (Y1, 09-10) and Year 2 (Y2, 10-11), respectively. These treatment efficiencies were similar to surface water volumetric retention (49% in Y1 and 84% in Y2) and varied primarily with rainfall. Similar water volume and P retentions indicate that volume retention is the main process controlling P loads. A limited role of biogeochemical processes was supported by low to no remaining soil P adsorption capacity due to long-term drainage P input. The fact that outflow P concentrations (Y1 = 368.3 μg L- 1, Y2 = 230.4 μg L- 1) could be approximated by a simple mixing of rainfall and drainage P input further confirmed the near inert biogeochemical processes. Subsurface P losses through groundwater were 304 kg (27% of inflow P), indicating that they are an important source for downstream P. Including subsurface P losses reduces the treatment efficiency to 35% (from 61%). The aboveground biomass in the SDA contained 42% (240 kg) of the average incoming P load, suggesting that biomass harvesting could be a cost-effective alternative for reviving the role of biogeochemical processes to enhance P treatment in aged, P-saturated SDAs. The 20-year present economic value of P removal through harvesting was estimated to be $341,000, which if covered through a cost share or a payment for P treatment services program could be a positive outcome for both

13. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

Science.gov (United States)

Sedlack, Jeffrey D

2010-01-01

Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services.
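
Process control charting of the kind applied above can be sketched with an individuals (X) chart, whose control limits are set from the mean moving range. The waiting-time data below are hypothetical, not the study's records:

```python
import numpy as np

def individuals_chart_limits(x):
    """Center line and 3-sigma control limits for an individuals (X) chart."""
    x = np.asarray(x, float)
    mr_bar = np.abs(np.diff(x)).mean()   # mean moving range of successive points
    center = x.mean()
    # 2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size 2
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

# Hypothetical surgeon wait times between cases, in minutes
waits = [48, 55, 60, 45, 52, 58, 120, 50, 47, 53]
center, lcl, ucl = individuals_chart_limits(waits)
out_of_control = [w for w in waits if w < lcl or w > ucl]
```

Points outside the limits signal special-cause variation (here the 120-minute wait), which is the kind of evidence the abstract's "poor overall control" conclusion rests on.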

14. Implementation of statistical tools to support identification and management of persistent Listeria monocytogenes contamination in smoked fish processing plants.

Science.gov (United States)

Malley, Thomas J V; Stasiewicz, Matthew J; Gröhn, Yrjö T; Roof, Sherry; Warchocki, Steven; Nightingale, Kendra; Wiedmann, Martin

2013-05-01

Listeria monocytogenes persistence in food processing plants is a key source of postprocessing contamination of ready-to-eat foods. Thus, identification and elimination of sites where L. monocytogenes persists (niches) is critical. Two smoked fish processing plants were used as models to develop and implement environmental sampling plans (i) to identify persistent L. monocytogenes subtypes (EcoRI ribotypes) using two statistical approaches and (ii) to identify and eliminate likely L. monocytogenes niches. The first statistic, a binomial test based on ribotype frequencies, was used to evaluate L. monocytogenes ribotype recurrences relative to reference distributions extracted from a public database; the second statistic, a binomial test based on previous positives, was used to measure ribotype occurrences as a risk factor for subsequent isolation of the same ribotype. Both statistics revealed persistent ribotypes in both plants based on data from the initial 4 months of sampling. The statistic based on ribotype frequencies revealed persistence of particular ribotypes at specific sampling sites. Two adaptive sampling strategies guided plant interventions during the study: sampling multiple times before and during processing and vector swabbing (i.e., sampling of additional sites in different directions [vectors] relative to a given site). Among sites sampled for 12 months, a Poisson model regression revealed borderline significant monthly decreases in L. monocytogenes isolates at both plants (P = 0.026 and 0.076). Our data indicate elimination of an L. monocytogenes niche on a food contact surface; niches on nonfood contact surfaces were not eliminated. Although our data illustrate the challenge of identifying and eliminating L. monocytogenes niches, particularly at nonfood contact sites in small and medium plants, the methods for identification of persistence we describe here should broadly facilitate science-based identification of microbial persistence.
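
A one-sided binomial test on ribotype frequencies, the first statistic described above, needs no specialised library. The sample counts and reference frequency below are hypothetical, not the plants' data:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): one-sided test of recurrence."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical example: a ribotype recovered in 8 of 40 environmental samples,
# against a 5% reference frequency extracted from a public database.
p_value = binom_sf(8, 40, 0.05)
```

A small p-value indicates the ribotype recurs more often than the reference distribution predicts, flagging it as a candidate persistent strain.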

15. On the statistical relationship between solar activity and spontaneous social processes

Science.gov (United States)

Rodkin, M. V.; Kharin, E. P.

2014-12-01

The starting times of mass spontaneous social movements have been compared with temporal changes in solar activity (Wolf numbers) and in the Aa index of geomagnetic activity. It is shown that relatively high values of solar and, hence, geomagnetic activity are typical (on average) of a set of years when social cataclysms began. In addition, the relationship between social activity and geomagnetic activity is expressed somewhat more strongly than with solar activity. Heliogeomagnetic activity itself is not, however, the cause of social conflicts, as is evidenced by the weakness of the statistical relationship and the fact that the time intervals of an extremely large number of social conflicts (the decades of the 1800s, 1910s, and 1990s) occur during periods of a reduced mean level of solar and geomagnetic activity. From an averaged statistical model of the solar-geomagnetic influence on social activity and the current status and forecast of the 24th solar cycle, we can assume that heliogeomagnetic factors will contribute to an increased level of sociopolitical activity at least until the end of 2014 and, possibly, a little longer.

16. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics

Science.gov (United States)

Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.

2016-01-01

Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid

17. Difficult decisions: A qualitative exploration of the statistical decision making process from the perspectives of psychology students and academics

Directory of Open Access Journals (Sweden)

Peter James Allen

2016-02-01

Full Text Available Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these ‘experts’ were able to describe a far more systematic, comprehensive, flexible and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in

18. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics.

Science.gov (United States)

Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D

2016-01-01

Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid

19. Asymptotic results and statistical procedures for time-changed Lévy processes sampled at hitting times

CERN Document Server

Rosenbaum, Mathieu

2010-01-01

We provide asymptotic results and develop high frequency statistical procedures for time-changed Lévy processes sampled at random instants. The sampling times are given by first hitting times of symmetric barriers whose distance with respect to the starting point is equal to $\varepsilon$. This setting can be seen as a first step towards a model for tick-by-tick financial data allowing for large jumps. For a wide class of Lévy processes, we introduce a renormalization depending on $\varepsilon$, under which the Lévy process converges in law to an $\alpha$-stable process as $\varepsilon$ goes to $0$. The convergence is extended to moments of hitting times and overshoots. In particular, these results allow us to construct consistent estimators of the time change and of the Blumenthal-Getoor index of the underlying Lévy process. Convergence rates and a central limit theorem are established under additional assumptions.

20. Harmonic statistics

Energy Technology Data Exchange (ETDEWEB)

Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

2017-05-15

The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
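
A harmonic Poisson process, i.e. a Poisson process on the positive half-line with intensity proportional to 1/x, can be simulated on a finite window by drawing a Poisson count with mean equal to the window's mean measure and placing points by inverse-CDF sampling. This sketch is an illustration of the object, not the paper's construction, and the window values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_harmonic_pp(a, b, c=1.0):
    """Points of a Poisson process with intensity c/x on the window [a, b]."""
    n = rng.poisson(c * np.log(b / a))   # mean measure of [a, b] is c*ln(b/a)
    u = rng.random(n)
    return np.sort(a * (b / a) ** u)     # inverse CDF of density proportional to 1/x

# Scale invariance: ln(b/a) is unchanged under x -> s*x, so the window
# [s*a, s*b] carries the same mean count as [a, b] for any s > 0.
counts = [len(sample_harmonic_pp(1.0, np.e ** 2)) for _ in range(4000)]
```

The empirical mean count over [1, e²] should approach ln(e²) = 2, and the scale invariance noted in the comment is the process's characteristic property.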

1. Statistical Techniques for Analyzing Process or "Similarity" Data in TID Hardness Assurance

Science.gov (United States)

2010-01-01

We investigate techniques for estimating the contributions to TID hardness variability for families of linear bipolar technologies, determining how part-to-part and lot-to-lot variability change for different part types in the process.

2. Statistical Optimization of process parameters for SiO2-Nickel nanocomposites by in-situ reduction

Directory of Open Access Journals (Sweden)

A.K.Pramanick

2015-06-01

Full Text Available The optimum combination of process parameters - temperature, time of reduction under nitrogen atmosphere and amount of NiCl2 - was delineated to find the maximum yield of nanocrystallite Ni in the synthesized silica gel matrix. A statistically adequate regression equation, within the 95% confidence limit, was developed by carrying out a set of experiments within the framework of design of experiment. The regression equation indicates the beneficial role of the temperature and time of reduction.
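
A two-level factorial analysis of this kind can be sketched as follows; the coded design and yield values are hypothetical, not the paper's data:

```python
import numpy as np

# 2^3 full factorial in coded units (-1/+1) for temperature, reduction time
# and NiCl2 amount; the yield values are hypothetical.
design = np.array([[s1, s2, s3]
                   for s1 in (-1, 1) for s2 in (-1, 1) for s3 in (-1, 1)])
yield_pct = np.array([52, 60, 55, 64, 58, 70, 61, 75], float)

# Main effect of each factor: mean response at +1 minus mean response at -1,
# i.e. (column . y) / (N/2) for an orthogonal two-level design.
effects = design.T @ yield_pct / (len(yield_pct) / 2)

# First-order regression in coded units: intercept is the grand mean and
# each slope is half the corresponding main effect.
intercept = yield_pct.mean()
slopes = effects / 2
```

The sign and magnitude of each effect show which parameters help, which is the kind of conclusion the abstract draws for reduction temperature and time.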

3. An investigation into the effectiveness of statistical process control techniques, with management data from a product development environment

OpenAIRE

Julien, Denyse

1998-01-01

The study reported on in this thesis was an empirical investigation into the implementation and use of Statistical Process Control (SPC) techniques and tools in a product development environment. The data used originated from four different business units in the European flavour division of a large international company belonging to the Flavour and Fragrance industry. The study highlights many of the problems related to the use of real data, and working with individuals throughout an organi...

4. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools

OpenAIRE

Hawe, David; Hernández Fernández, Francisco R.; O’Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O’Sullivan, Finbarr

2012-01-01

In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course da...

5. Statistical Process Control Concerning the Glazed Areas Influence on the Energy Efficiency of Buildings

Directory of Open Access Journals (Sweden)

Daniel Lepădatu

2008-01-01

Full Text Available The aim of this paper is to present a statistical investigation analyzing building characteristics from the energy efficiency point of view. The energy efficiency of buildings may be estimated by their capacity to ensure a healthy and comfortable environment with low energy consumption throughout the year. Glazed areas play a decisive role in building energy efficiency, given the complex functions they perform in the system. A parametric study, based on a two-level factorial design of experiments, allows us to assess the extent to which the geometric and energetic characteristics of glazed areas influence energy efficiency, estimated by the yearly energy needs to ensure a comfortable and healthy environment.

6. Marrakesh International Conference on Probability and Statistics

CERN Document Server

Ouassou, Idir; Rachdi, Mustapha

2015-01-01

This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

7. PROBABILITY AND STATISTICS.

Science.gov (United States)

STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

8. Evaluation of the influence of locality, vintage year, wine variety and fermentation process on the volume of copper and lead in wine

Directory of Open Access Journals (Sweden)

Jaroslav Jedlička

2014-11-01

Full Text Available We evaluated the influence of locality, vintage year and fermentation process on the copper and lead content of grape must and wine. Copper and lead were first assessed in fresh grape musts, which were subsequently fermented. Analysis of the wines showed a large decrease of copper during the fermentation process: assessed Cu2+ values varied from 0.07 to 0.2 mg.L-1, representing a decrease of 90 to 97% from the original copper content. The copper content of the grapes is probably also significantly influenced by the amount of precipitation falling in the second half of the growing season. Total rainfall in the period before grape harvesting (the months of August - September) was 153 mm in the first year and 137.5 mm in the second, above-average values in both observed vintage years. Copper cannot be eliminated entirely from the protection of the vine against fungal diseases, because pathogens do not develop resistance to it; to address this problem it is suitable to combine copper and organic products. Fermentation acts as a biological filter and also influences the lead content: in the analysed wines we found a decrease of lead of 25 to 94%, with a maximal assessed Pb2+ value in wine of 0.09 mg.L-1. A linear relationship between lead and copper in grape must and lead and copper in wine was not statistically demonstrated. We found a statistically significant effect of vintage year on the lead content of grape must, which, as we supposed, was connected with the quantity and distribution of atmospheric precipitation during the vegetation period. On the basis of the assessed lead and copper contents in wine, we state that by using faultless material and appropriate technological equipment during wine production it is possible to eliminate almost

9. New municipal solid waste processing technology reduces volume and provides beneficial reuse applications for soil improvement and dust control

Science.gov (United States)

A garbage-processing technology has been developed that shreds, sterilizes, and separates inorganic and organic components of municipal solid waste. The technology not only greatly reduces waste volume, but the non-composted byproduct of this process, Fluff®, has the potential to be utilized as a s...

10. Rho family GTP binding proteins are involved in the regulatory volume decrease process in NIH3T3 mouse fibroblasts

DEFF Research Database (Denmark)

Pedersen, Stine F; Beisner, Kristine H; Willumsen, Berthe M

2002-01-01

The role of Rho GTPases in the regulatory volume decrease (RVD) process following osmotic cell swelling is controversial and has so far only been investigated for the swelling-activated Cl- efflux. We investigated the involvement of RhoA in the RVD process in NIH3T3 mouse fibroblasts, using wild-...

11. Statistical early-warning indicators based on Auto-Regressive Moving-Average processes

CERN Document Server

Faranda, Davide; Dubrulle, Bérengère

2014-01-01

We address the problem of defining early warning indicators of critical transition. To this purpose, we fit the relevant time series through a class of linear models, known as Auto-Regressive Moving-Average (ARMA(p,q)) models. We define two indicators representing the total order and the total persistence of the process, linked, respectively, to the shape and to the characteristic decay time of the autocorrelation function of the process. We successfully test the method to detect transitions in a Langevin model and a 2D Ising model with nearest-neighbour interaction. We then apply the method to complex systems, namely for dynamo thresholds and financial crisis detection.
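
The persistence indicator above, linked to the characteristic decay time of the autocorrelation function, can be illustrated for the simplest member of the ARMA family, an AR(1) process. The coefficients and sample sizes below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1_decay_time(x):
    """Lag-1 autocorrelation estimate and ACF decay time -1/ln(phi) for AR(1)."""
    x = np.asarray(x, float) - np.mean(x)
    phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
    return phi, -1.0 / np.log(phi)

def simulate_ar1(phi, n=20000):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# Far from (phi = 0.5) versus near (phi = 0.95) a critical transition:
# persistence, and hence the decay time, grows as the transition approaches.
phi_far, tau_far = ar1_decay_time(simulate_ar1(0.5))
phi_near, tau_near = ar1_decay_time(simulate_ar1(0.95))
```

Rising decay time (critical slowing down) is the basic signal that the full ARMA(p,q) indicators generalize.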

12. A Probabilistic Collocation Method Based Statistical Gate Delay Model Considering Process Variations and Multiple Input Switching

CERN Document Server

Kumar, Y Satish; Talarico, Claudio; Wang, Janet; 10.1109/DATE.2005.31

2011-01-01

Since the advent of new nanotechnologies, the variability of gate delay due to process variations has become a major concern. This paper proposes a new gate delay model that includes the impact of both process variations and multiple input switching. The proposed model uses an orthogonal-polynomial-based probabilistic collocation method to construct an analytical delay equation from circuit timing performance. In the experimental results, our approach has less than 0.2% error on the mean delay of gates and less than 3% error on the standard deviation.
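The paper's exact collocation scheme is not reproduced in the abstract, but the core idea, evaluating a delay model at a handful of orthogonal-polynomial collocation points instead of running a full Monte Carlo, can be sketched for a hypothetical one-parameter delay model (the delay polynomial below is illustrative, not from the paper):

```python
import math

# 3-point probabilists' Gauss-Hermite rule: exact for polynomials up to degree 5
NODES = (-math.sqrt(3.0), 0.0, math.sqrt(3.0))
WEIGHTS = (1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0)

def gate_delay(x, d0=100.0):
    """Hypothetical gate delay (ps) as a polynomial in a normalized N(0,1) process parameter."""
    return d0 * (1.0 + 0.10 * x + 0.02 * x * x)

def collocation_moments(f):
    """Mean and std of f(X), X ~ N(0,1), from just three collocation evaluations."""
    mean = sum(w * f(x) for w, x in zip(WEIGHTS, NODES))
    second = sum(w * f(x) ** 2 for w, x in zip(WEIGHTS, NODES))
    return mean, math.sqrt(second - mean ** 2)

mean, std = collocation_moments(gate_delay)  # exact for this quadratic: mean 102, var 108
```

Because the rule integrates polynomials up to degree 5 exactly, three evaluations recover the exact mean and variance of the quadratic delay model, which is why collocation can match Monte Carlo accuracy at a fraction of the cost.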

13. Bayesian Reliability-Growth Analysis for Statistical of Diverse Population Based on Non-homogeneous Poisson Process

Institute of Scientific and Technical Information of China (English)

MING Zhimao; TAO Junyong; ZHANG Yunan; YI Xiaoshan; CHEN Xun

2009-01-01

To improve reliability before mass production begins, new armament systems require methods for handling multi-stage reliability-growth statistics drawn from diverse populations. Because development testing of complex systems is expensive and sample sizes are small, this paper studies how to process Bayesian reliability-growth information regarding diverse populations. First, reflecting the characteristics of reliability growth during product development, the Bayesian method is used to integrate the multi-stage test information with the order relations of the distribution parameters. A Gamma-Beta prior distribution is then proposed, based on a non-homogeneous Poisson process (NHPP) corresponding to the reliability-growth process. The posterior distribution of the reliability parameters is obtained for each product stage, and the reliability parameters are evaluated from the posterior distribution. Finally, the proposed Bayesian approach for multi-stage reliability-growth testing is applied to a small-sample test process in the astronautics field. The results of a numerical example show that the presented model can synthesize the diverse information and pave the way for applying the Bayesian model to multi-stage reliability-growth evaluation with small samples. The method is useful for evaluating multi-stage system reliability and for making rational reliability-growth plans.
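The NHPP at the heart of such models can be simulated by standard thinning. The sketch below is my own illustration, using a hypothetical exponentially decaying failure intensity rather than the paper's Gamma-Beta prior, to show how a reliability-growth failure history might be generated:

```python
import math
import random

def simulate_nhpp(intensity, lam_max, horizon, seed=0):
    """Lewis-Shedler thinning: simulate event times of a non-homogeneous
    Poisson process with intensity(t) <= lam_max on [0, horizon]."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)              # candidate from a rate-lam_max process
        if t > horizon:
            return events
        if rng.random() < intensity(t) / lam_max:  # keep with probability intensity(t)/lam_max
            events.append(t)

# Hypothetical decaying failure intensity: failures thin out as reliability grows.
lam0, tau, horizon = 10.0, 5.0, 20.0
failures = simulate_nhpp(lambda t: lam0 * math.exp(-t / tau), lam0, horizon)
# E[N] = lam0 * tau * (1 - exp(-horizon/tau)), about 49 for these values
```

A decreasing intensity is the signature of reliability growth: early test stages produce dense failures, later stages sparse ones, and Bayesian inference on the intensity parameters across stages is what the paper's Gamma-Beta construction formalizes.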

14. Statistical methods

CERN Document Server

Szulc, Stefan

1965-01-01

Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

15. Point Processes Modeling of Time Series Exhibiting Power-Law Statistics

CERN Document Server

Kaulakys, B; Gontis, V

2010-01-01

We consider stochastic point processes generating time series exhibiting power laws of spectrum and distribution density (Phys. Rev. E 71, 051105 (2005)) and apply them for modeling the trading activity in the financial markets and for the frequencies of word occurrences in the language.

16. Combining Natural Language Processing and Statistical Text Mining: A Study of Specialized versus Common Languages

Science.gov (United States)

Jarman, Jay

2011-01-01

This dissertation focuses on developing and evaluating hybrid approaches for analyzing free-form text in the medical domain. This research draws on natural language processing (NLP) techniques that are used to parse and extract concepts based on a controlled vocabulary. Once important concepts are extracted, additional machine learning algorithms,…

17. Gráficos de controle X para processos robustos / Statistical control of robust processes

Directory of Open Access Journals (Sweden)

Antonio Fernando Branco Costa

1998-12-01

Full Text Available Technological development has reduced the variability between items produced on a large scale, making production processes increasingly robust. In this context, a small change in the process can be critical, requiring rapid action to eliminate it. Traditionally, CUSUM or moving-average charts are used to control such processes, which we call robust processes. The aim of this paper is to show that Shewhart charts can also be used to control robust processes, provided the constraint of fixed design parameters is relaxed.
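The paper's point, that a Shewhart chart can monitor a robust (low-variability) process once its design parameters are allowed to vary, can be sketched with a toy X-bar chart on hypothetical data; tightening the limit width L is one of the design-parameter changes in question:

```python
import random
import statistics

def xbar_chart(subgroups, L=3.0):
    """Flag subgroups whose mean falls outside grand_mean +/- L*sigma/sqrt(n).
    For robust processes, L (or the subgroup size n, or the sampling interval)
    is varied instead of being held at the classical fixed values."""
    n = len(subgroups[0])
    grand = statistics.mean(m for g in subgroups for m in g)
    sigma = statistics.pstdev(m for g in subgroups for m in g)  # crude sigma estimate
    half_width = L * sigma / n ** 0.5
    return [i for i, g in enumerate(subgroups)
            if abs(statistics.mean(g) - grand) > half_width]

rng = random.Random(1)
in_control = [[rng.gauss(10.0, 0.1) for _ in range(5)] for _ in range(30)]
shifted = [[rng.gauss(10.3, 0.1) for _ in range(5)] for _ in range(5)]  # 3-sigma shift
signals = xbar_chart(in_control + shifted, L=2.5)  # shifted subgroups (indices 30-34) flagged
```

With the narrowed limits (L=2.5 rather than 3), the chart catches the small shift quickly, at the cost of a slightly higher false-alarm rate, which is exactly the design trade-off the paper analyzes.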

19. Fully automated rodent brain MR image processing pipeline on a Midas server: from acquired images to region-based statistics.

Science.gov (United States)

Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek

2013-01-01

Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.

20. Statistical key variable analysis and model-based control for improvement performance in a deep reactive ion etching process

Institute of Scientific and Technical Information of China (English)

Chen Shan; Pan Tianhong; Li Zhengming; Jang Shi-Shang

2012-01-01

This paper proposes a data-driven estimator of via depth for the deep reactive ion etching (DRIE) process, based on statistical identification of key variables. Several feature extraction algorithms are presented to reduce the high-dimensional data and effectively support the subsequent virtual metrology (VM) model building. With the on-line VM model available, a model-based controller is readily applicable to improve via-depth quality. Real operational data taken from an industrial manufacturing process are used to verify the effectiveness of the proposed method. The results demonstrate that the proposed method decreases the MSE from 2.2 × 10⁻² to 9 × 10⁻⁴ and has great potential for improving the existing DRIE process.

1. Stationarity and periodicities of linear speed of coronal mass ejection: a statistical signal processing approach

Science.gov (United States)

Chattopadhyay, Anirban; Khondekar, Mofazzal Hossain; Bhattacharjee, Anup Kumar

2017-09-01

In this paper we search for periodicities in the linear speed of coronal mass ejections (CMEs) during solar cycle 23. Double exponential smoothing and the Discrete Wavelet Transform are used to detrend and filter the CME linear-speed time series. To choose an appropriate statistical methodology, the Smoothed Pseudo Wigner-Ville Distribution (SPWVD) is first used to confirm the non-stationarity of the time series. Time-frequency representation tools, namely the Hilbert-Huang Transform and Empirical Mode Decomposition, are then applied to unearth the periodicities underlying the non-stationary time series of the CME linear speed. Of all the periodicities with more than 95% confidence level, the relevant ones are segregated out using an integral peak-detection algorithm. The observed periodicities are of short scale, ranging from 2 to 159 days, with relevant periods near 4, 10, 11, 12, 13.7, 14.5 and 21.6 days. These short-range periodicities indicate that the probable origin of the CMEs is the active longitudes and the magnetic flux network of the Sun. The results also suggest a probable mutual influence and causality with other solar activities (such as solar radio emission, Ap index and solar wind speed), owing to the similarity between their periods and the CME linear-speed periods. The 4-day and 10-day periodicities indicate the possible existence of Rossby-type (planetary) waves in the Sun.

2. Neural sensitivity to statistical regularities as a fundamental biological process that underlies auditory learning: the role of musical practice.

Science.gov (United States)

François, Clément; Schön, Daniele

2014-02-01

There is increasing evidence that humans and nonhuman mammals are sensitive to the statistical structure of auditory input. Indeed, neural sensitivity to statistical regularities seems to be a fundamental biological property underlying auditory learning. In the case of speech, statistical regularities play a crucial role in the acquisition of several linguistic features, from phonotactic to more complex rules such as morphosyntactic rules. Interestingly, a similar sensitivity has been shown with non-speech streams: sequences of sounds changing in frequency or timbre can be segmented on the sole basis of conditional probabilities between adjacent sounds. We recently ran a set of cross-sectional and longitudinal experiments showing that merging music and speech information in song facilitates stream segmentation and, further, that musical practice enhances sensitivity to statistical regularities in speech at both neural and behavioral levels. Based on recent findings showing the involvement of a fronto-temporal network in speech segmentation, we defend the idea that the enhanced auditory learning observed in musicians originates via at least three distinct pathways: enhanced low-level auditory processing, enhanced phono-articulatory mapping via the left Inferior Frontal Gyrus and Pre-Motor cortex, and increased functional connectivity within the audio-motor network. Finally, we discuss how these data support a beneficial use of music for optimizing speech acquisition in both normal and impaired populations.

3. Results of wavelet processing of the 2K-capture Kr-78 experiment statistics

CERN Document Server

Gavrilyuk, Yu M; Kazalov, V V; Kuzminov, V V; Panasenko, S I; Ratkevich, S S

2010-01-01

Results of a search for Kr-78 double K-capture with the large low-background proportional counter (2005-2008) at the Baksan Neutrino Observatory are presented. The experimental method and the characteristics of the detectors are described. Basic features of the processing of the digitized pulses using the wavelet transform are considered. Taking into account the analysis of individual noise characteristics, it has been shown that an appropriate choice of both wavelet characteristics and the sequence of processing algorithms allows one to decrease the background in the energy region of useful events with a unique set of characteristics by a factor of ~2000. A new limit on the half-life of Kr-78 with regard to 2K-capture has been found: T_{1/2} >= 2.4E21 yr (90% C.L.).

4. Statistical post-processing of probabilistic wind speed forecasting in Hungary

Energy Technology Data Exchange (ETDEWEB)

Baran, Sandor; Nemoda, Dora [Debrecen Univ. (Hungary). Faculty of Informatics; Horanyi, Andras [Hungarian Meteorological Service (Hungary)

2013-10-15

Weather forecasting is mostly based on the outputs of deterministic numerical weather forecasting models. Multiple runs of these models with different initial conditions result in a forecast ensemble which is applied for estimating the future distribution of atmospheric variables. However, as these ensembles are usually under-dispersive and uncalibrated, post-processing is required. In the present work, Bayesian Model Averaging (BMA) is applied for calibrating ensembles of wind speed forecasts produced by the operational Limited Area Model Ensemble Prediction System of the Hungarian Meteorological Service (HMS). We describe two possible BMA models for wind speed data of the HMS and show that BMA post-processing significantly improves the calibration and accuracy of point forecasts. (orig.)
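The Gaussian-kernel variant of BMA can be sketched as follows. This is an illustrative toy with made-up numbers, not the HMS setup; operational wind-speed BMA typically uses gamma rather than normal kernels:

```python
import math

def bma_predictive(forecasts, weights, biases, sigma):
    """BMA predictive distribution as a weighted mixture of normal kernels,
    each centred on a bias-corrected ensemble member forecast."""
    means = [f + b for f, b in zip(forecasts, biases)]
    mean = sum(w * m for w, m in zip(weights, means))
    # law of total variance for the mixture
    var = sigma ** 2 + sum(w * (m - mean) ** 2 for w, m in zip(weights, means))
    return mean, math.sqrt(var)

# two hypothetical ensemble members forecasting wind speed (m/s)
mean, sd = bma_predictive([6.0, 8.0], [0.6, 0.4], [-0.5, -0.5], sigma=1.2)
```

The mixture spread exceeds the kernel spread sigma whenever the bias-corrected members disagree, which is how BMA widens an under-dispersive ensemble into a calibrated predictive distribution.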

5. The privatization of central statistics office; case of Botswana: The challenges of the change process

OpenAIRE

Maitumelo Molebatsi; Rudolph Lenyeletse Boy

2012-01-01

The purpose of this study was to examine the challenges of the change process experienced by the CSO in becoming a partly privatized organization, and to suggest better ways of managing the transition. The research was conducted through oral and questionnaire-based interviews, so both quantitative and qualitative methods were used. The targeted respondents were CSO staff members, since the change the study focuses on mainly concerns changing their organizati...

6. Statistics; Tilastot

Energy Technology Data Exchange (ETDEWEB)

NONE

1998-12-31

For the years 1997 and 1998, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The energy units and the conversion coefficients used for them are shown on the inside of the Review's back cover. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, and pollution fees on oil products

7. Statistics; Tilastot

Energy Technology Data Exchange (ETDEWEB)

NONE

1998-12-31

For the years 1997 and 1998, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The energy units and the conversion coefficients used for them are shown on the inside of the Review's back cover. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, and pollution fees on oil products

8. Statistical properties of a filtered Poisson process with additive random noise: Distributions, correlations and moment estimation

CERN Document Server

Theodorsen, Audun; Rypdal, Martin

2016-01-01

The filtered Poisson process is often used as a reference model for intermittent fluctuations in physical systems. Here, this process is extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The moments, probability density function, autocorrelation function and power spectral density are derived and used to compare the effects of the different noise terms. Monte-Carlo studies of synthetic time series are used to investigate the accuracy of parameter estimation and to identify methods for separating the noise types. It is shown that the probability density function and the three lowest moments provide accurate estimations of the parameters, but are unable to separate the noise types. The autocorrelation function and the power spectral density also provide methods for estimating the model parameters, as well as being capable of determining the noise type. The number of times the signal passes a prescribed threshold in t...
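A crude discretized sketch of such a process (my own illustration, not the authors' implementation) generates shot noise with exponential pulses and then adds the purely additive noise variant, so the moment relations mentioned in the abstract can be checked numerically:

```python
import math
import random

def filtered_poisson(rate, amp_mean, tau, dt, n, seed=0):
    """Shot noise: Poisson arrivals at the given rate, exponentially distributed
    pulse amplitudes (mean amp_mean), one-sided exponential pulses of duration
    tau, sampled on a grid with step dt << tau."""
    rng = random.Random(seed)
    decay = math.exp(-dt / tau)
    x, out = 0.0, []
    for _ in range(n):
        x *= decay
        if rng.random() < rate * dt:  # at most one arrival per small step
            x += rng.expovariate(1.0 / amp_mean)
        out.append(x)
    return out

clean = filtered_poisson(rate=0.1, amp_mean=1.0, tau=1.0, dt=0.05, n=200000)
rng = random.Random(1)
noisy = [v + rng.gauss(0.0, 0.1) for v in clean]  # purely additive noise variant
# stationary mean of the clean process is rate * tau * amp_mean = 0.1;
# additive noise leaves the mean unchanged and adds 0.01 to the variance
```

Because the additive noise shifts the variance but not the mean, the lowest moments identify the model parameters yet cannot reveal which noise term was present, consistent with the paper's conclusion.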

9. Vulnerability of recent GPS adaptive antenna processing (and all STAP/SLC) to statistically nonstationary jammer threats

Science.gov (United States)

Kerr, Thomas H., III

2001-11-01

We alert the reader here to an apparent vulnerability in recent adaptive antenna processing that claims increased GPS jammer resistance via lucrative new approaches to adaptive beamforming and/or null-steering but, unfortunately, presumes only simplistic, unsophisticated wideband barrage WGN jammers as threats. When jammers are less cooperative by being statistically non-stationary (e.g., by exhibiting time-varying means or biases, by being synchronized blinking jammer pairs, or by varying total power output with time), statistics on the jammers can apparently no longer be successfully extracted from time-averages (even in post-processing mode using long blocks of saved received data) because ergodicity of the underlying intermediate covariance estimate is lost. This unpleasant situation arises for abstracted, idealized STAP or SLC algorithms making use of the by now familiar fundamental STAP-related expression R^{-1}v, where the necessary intermediate covariance estimate R = R1 + R2 + R3 has the three indicated constituent components due to thermal/environmental noise, clutter, and jammers, respectively. Without the ability to accurately estimate R3, the appropriate jammer nullings apparently can no longer be activated successfully. Systems susceptible to such jamming can be revealed by in-situ tests with simple equipment. Topics herein include updates to PRN generation, to statistical tests in general, and to the CLT in particular, all three updates being new to most engineers.

10. Cintichem modified process - {sup 99}Mo precipitation step: application of statistical analysis tools over the reaction parameters

Energy Technology Data Exchange (ETDEWEB)

Teodoro, Rodrigo; Dias, Carla R.B.R.; Osso Junior, Joao A., E-mail: jaosso@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Fernandez Nunez, Eutimio Gustavo [Universidade de Sao Paulo (EP/USP), SP (Brazil). Escola Politecnica. Dept. de Engenharia Quimica

2011-07-01

Precipitation of {sup 99}Mo by {alpha}-benzoin oxime ({alpha}-Bz) is a standard precipitation method for molybdenum due to the high selectivity of this agent. Statistical analysis tools are nowadays employed in analytical systems to prove their efficiency and feasibility. IPEN has a project aiming at the production of {sup 99}Mo via the fission of {sup 235}U. The processing uses as its first step the precipitation of {sup 99}Mo with {alpha}-Bz, a step that involves many key reaction parameters. The aim of this work is the development of the already known acidic route to produce {sup 99}Mo, as well as the optimization of the reaction parameters using statistical tools. To simulate {sup 99}Mo precipitation, the study was conducted in acidic media using HNO{sub 3}, with {alpha}-Bz as precipitant agent and NaOH/1%H{sub 2}O{sub 2} as dissolver solution. A Mo carrier, KMnO{sub 4} solutions and a {sup 99}Mo tracer were then added to the reaction flask. The reaction parameters ({alpha}-Bz/Mo ratio, Mo carrier, reaction time and temperature, and cooling time before filtration) were evaluated under a fractional factorial design of resolution V. The best value of each reaction parameter was determined by response-surface statistical planning. The precipitation and recovery yields of {sup 99}Mo were measured using an HPGe detector. Statistical analysis of the experimental data suggested that the {alpha}-Bz/Mo ratio, reaction time and temperature have a significant impact on {sup 99}Mo precipitation. The optimization planning showed that higher {alpha}-Bz/Mo ratios, room temperature, and shorter reaction times lead to higher {sup 99}Mo yields. (author)

11. Damage localization by statistical evaluation of signal-processed mode shapes

DEFF Research Database (Denmark)

Ulriksen, Martin Dalgaard; Damkilde, Lars

2015-01-01

in the spatial mode shape signals, hereby potentially facilitating damage detection and/or localization. However, by being based on distinguishing damage-induced discontinuities from other signal irregularities, an intrinsic deficiency in these methods is the high sensitivity towards measurement noise....... The present article introduces a damage localization method which, compared to the conventional mode shape-based methods, has greatly enhanced robustness towards measurement noise. The method is based on signal processing of spatial mode shapes by means of continuous wavelet transformation (CWT...

12. Ethanol production from banana peels using statistically optimized simultaneous saccharification and fermentation process.

Science.gov (United States)

Oberoi, Harinder Singh; Vadlani, Praveen V; Saida, Lavudi; Bansal, Sunil; Hughes, Joshua D

2011-07-01

Dried and ground banana peel biomass (BP) after hydrothermal sterilization pretreatment was used for ethanol production using simultaneous saccharification and fermentation (SSF). Central composite design (CCD) was used to optimize concentrations of cellulase and pectinase, temperature and time for ethanol production from BP using SSF. Analysis of variance showed a high coefficient of determination (R(2)) value of 0.92 for ethanol production. On the basis of model graphs and numerical optimization, the validation was done in a laboratory batch fermenter with cellulase, pectinase, temperature and time of nine cellulase filter paper unit/gram cellulose (FPU/g-cellulose), 72 international units/gram pectin (IU/g-pectin), 37 °C and 15 h, respectively. The experiment using optimized parameters in batch fermenter not only resulted in higher ethanol concentration than the one predicted by the model equation, but also saved fermentation time. This study demonstrated that both hydrothermal pretreatment and SSF could be successfully carried out in a single vessel, and use of optimized process parameters helped achieve significant ethanol productivity, indicating commercial potential for the process. To the best of our knowledge, ethanol concentration and ethanol productivity of 28.2 g/l and 2.3 g/l/h, respectively from banana peels have not been reported to date.
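The central composite design (CCD) used above can be sketched generically; the abstract does not give the coded settings, so the axial distance and number of centre runs below are illustrative assumptions:

```python
from itertools import product

def central_composite(k, alpha=2.0, n_center=6):
    """Coded design points of a central composite design for k factors:
    2**k factorial corners, 2k axial (star) points at +/-alpha, plus centre runs."""
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return corners + axial + center

# four factors as in the study: cellulase, pectinase, temperature, time
design = central_composite(4)  # 2**4 + 2*4 + 6 = 30 runs
```

Each coded point is then mapped to physical factor levels, run, and the responses fitted with a second-order model, which is what supplies the response-surface graphs used for the numerical optimization.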

13. The Measurement of Optimization Performance of Managed Service Division with ITIL Framework using Statistical Process Control

Directory of Open Access Journals (Sweden)

Kasman Suhairi

2013-03-01

Full Text Available The purpose of the Configuration Management process is to ensure that all IT assets, their status, configuration, and relationships to one another are well documented. This documentation serves several purposes. The first objective is to create clarity in the relationship between the key performance indicators (KPIs) of an IT service and the underlying infrastructure: changes to the configuration of those devices can obviously disturb the performance of IT services. The second objective is the accuracy of the information used by the Service Delivery processes. Consider a Service Desk staff member who needs to know how a user at a branch office connects to the headquarters network, in connection with access problems for certain applications; accurate network configuration information helps the Service Desk staff solve the user's problem. The third objective is the accuracy of the information used for IT audits. PT. XYZ is a relatively new telecommunications company, aware of the increasingly competitive telecommunications industry. PT. XYZ started its operations in 2006. The company's ambition is to develop progressively by improving operational performance, closely linking operational performance improvements with the company's bottom line. Thus, in the global era of integrated telecommunications services, it is a necessity for companies to focus on the quality of service (QoS) provided to their customers in order to survive in an increasingly competitive telecommunications business.

14. Single-digit arithmetic processing – anatomical evidence from statistical voxel-based lesion analysis

Directory of Open Access Journals (Sweden)

Urszula eMihulowicz

2014-05-01

Full Text Available Different specific mechanisms have been suggested for solving single-digit arithmetic operations. However, the neural correlates underlying basic arithmetic (multiplication, addition, subtraction) are still under debate. In the present study, we systematically assessed single-digit arithmetic in a group of acute stroke patients (n=45) with circumscribed left- or right-hemispheric brain lesions. Lesion sites significantly related to impaired performance were found only in the left-hemisphere-damaged group. Deficits in multiplication and addition were related to subcortical/white-matter brain regions differing from those for subtraction tasks, corroborating the notion of distinct processing pathways for different arithmetic tasks. Additionally, our results further point to the importance of investigating fiber pathways in numerical cognition.

15. Use of statistical process control in the production of blood components

DEFF Research Database (Denmark)

Magnussen, K.; Quere, S.; Winkel, P.

2008-01-01

occasional component manufacturing staff to an experienced regular manufacturing staff. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood...... components produced and gives an example of how to meet EU legislative requirements in a small-scale production centre. Data included quality control measurements in 363 units of red blood cells, 79 units of platelets produced by an occasional staff with 11 technologists and 79 units of platelets produced...... staff were out of control, which was not the case with the experienced staff. Introduction of control charts to a small blood centre has elucidated the difficulties in controlling the blood production and shown the advantage of using experienced regular component manufacturing staff. Publication date: 2008/6

16. Critical Infrastructure Protection II, The International Federation for Information Processing, Volume 290.

Science.gov (United States)

Papa, Mauricio; Shenoi, Sujeet

The information infrastructure -- comprising computers, embedded devices, networks and software systems -- is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection II describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: - Themes and Issues - Infrastructure Security - Control Systems Security - Security Strategies - Infrastructure Interdependencies - Infrastructure Modeling and Simulation This book is the second volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of twenty edited papers from the Second Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection held at George Mason University, Arlington, Virginia, USA in the spring of 2008.

17. Feasibility evaluation solar heated textile process water. Volume II. Appendices. Final report

Energy Technology Data Exchange (ETDEWEB)

Hester, J. C.; Beard, J. N.; Robinson, G. F.; Harnett, R. M.

1977-02-01

The general objectives of this study are to determine the technical and economic feasibility of the use of solar energy for heating waters in the textile industry and to develop a plan for efforts beyond this feasibility study phase. Specific objectives include (1) determine the industry requirements for heated process water, (2) assess particular schemes and their economic impact, (3) study the total cost environment for solar water heating in this industry, and (4) recommend future experiments. This volume contains the appendices: (A) fiber distribution and end use data; (B) computer model description for textile plant energy balances; (C) computer model description to generate local solar potential; (D) computer model description for system synthesis and analysis; (E) computer model to determine pressure drop, flow distribution and plumbing components; (F) area requirement plots for various use rates, temperature levels, seasons, orientations and collector types for textile operations; (G) computer model description of economic variables for COSMO1 and COSMO2; (H) rate of return plots for various textile applications and energy cost scenarios; and (I) data base for efficiency curves for six collector types. (WHK)

18. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

Energy Technology Data Exchange (ETDEWEB)

Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)]

2015-08-15

Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
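The PCA monitoring step described in this record is typically implemented by projecting new observations onto loadings learned from in-control data and tracking Hotelling's T² in the score space. A minimal sketch on synthetic data (not the plant's measurements):

```python
import numpy as np

def pca_t2(X_train, X_new, n_components=2):
    """Hotelling's T^2 in the PCA score space, the usual monitoring
    statistic in PCA-based process monitoring (minimal sketch)."""
    mu = X_train.mean(axis=0)
    # Principal directions via SVD of the centered in-control data
    U, s, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:n_components].T                          # loading vectors
    score_var = s[:n_components] ** 2 / (len(X_train) - 1)
    scores = (X_new - mu) @ P
    return np.sum(scores ** 2 / score_var, axis=1)   # T^2 per observation

# Synthetic "in-control" data: five strongly correlated process variables
rng = np.random.default_rng(0)
common = rng.normal(size=(200, 1))
X_train = common @ np.ones((1, 5)) + 0.1 * rng.normal(size=(200, 5))
X_ok = rng.normal(size=(10, 1)) @ np.ones((1, 5)) + 0.1 * rng.normal(size=(10, 5))
X_fault = X_ok + 5.0                                 # sustained mean shift
t2_ok = pca_t2(X_train, X_ok)
t2_fault = pca_t2(X_train, X_fault)
```

Observations whose T² exceeds a control limit derived from the in-control distribution would be flagged as outliers or disturbances.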

19. Visualizing microbial dechlorination processes in underground ecosystem by statistical correlation and network analysis approach.

Science.gov (United States)

Yamazawa, Akira; Date, Yasuhiro; Ito, Keijiro; Kikuchi, Jun

2014-03-01

Microbial ecosystems are typified by diverse microbial interactions and competition. Consequently, the microbial networks and metabolic dynamics of bioprocesses catalyzed by these ecosystems are highly complex, and their visualization is regarded as essential to bioengineering technology and innovation. Here we describe a means of visualizing the variants in a microbial community and their metabolic profiles. The approach enables previously unidentified bacterial functions in the ecosystems to be elucidated. We investigated the anaerobic bioremediation of chlorinated ethene in a soil column experiment as a case study. Microbial community and dechlorination profiles in the ecosystem were evaluated by denaturing gradient gel electrophoresis (DGGE) fingerprinting and gas chromatography, respectively. Dechlorination profiles were obtained from changes in dechlorination by the microbial community (evaluated by data mining methods). Individual microbes were then associated with their dechlorination profiles by heterogeneous correlation analysis. Our correlation-based visualization approach enables deduction of the roles and functions of bacteria in the dechlorination of chlorinated ethenes. Because it estimates functions and relationships between unidentified microbes and metabolites in microbial ecosystems, this approach is proposed as a control-logic tool by which to understand complex microbial processes.
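The heterogeneous correlation step pairs each community member's abundance profile with the measured dechlorination profiles and keeps strong correlations as candidate network edges. A toy sketch with entirely synthetic profiles (the microbes, metabolite, and threshold are illustrative assumptions, not the study's data):

```python
import numpy as np

# Synthetic time courses: a declining chlorinated-ethene concentration
# and two DGGE band-intensity profiles (hypothetical microbes)
rng = np.random.default_rng(7)
dce = np.linspace(1.0, 0.1, 20) + 0.02 * rng.normal(size=20)
bands = np.vstack([
    1.0 - dce + 0.05 * rng.normal(size=20),  # putative dechlorinator
    rng.normal(size=20),                     # unrelated microbe
])

# Correlate each band with the metabolite profile; threshold to get edges
corr = np.array([np.corrcoef(b, dce)[0, 1] for b in bands])
edges = np.abs(corr) > 0.7                   # network links (assumed cutoff)
```

A strong negative correlation (abundance rising while the substrate falls) is what would suggest a dechlorinating role in the visualized network.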

20. Statistical optimization for biodiesel production from waste frying oil through two-step catalyzed process

Energy Technology Data Exchange (ETDEWEB)

Charoenchaitrakool, Manop [Department of Chemical Engineering, Faculty of Engineering, Kasetsart University, Bangkok (Thailand); Center for Advanced Studies in Nanotechnology and its Applications in Chemical, Food and Agricultural Industries, Kasetsart University, Bangkok (Thailand)]; Thienmethangkoon, Juthagate [Department of Chemical Engineering, Faculty of Engineering, Kasetsart University, Bangkok (Thailand)]

2011-01-15

The aim of this work was to investigate the optimum conditions for biodiesel production from waste frying oil using a two-step catalyzed process. In the first step, sulfuric acid was used as a catalyst for the esterification reaction of free fatty acid and methanol in order to reduce the free fatty acid content to approximately 0.5%. In the second step, the product from the first step was further reacted with methanol using potassium hydroxide as a catalyst. The Box-Behnken design of experiment was carried out using MINITAB RELEASE 14, and the results were analyzed using response surface methodology. The optimum conditions for biodiesel production were obtained when using a methanol to oil molar ratio of 6.1:1, 0.68 wt.% of sulfuric acid, at 51 °C with a reaction time of 60 min in the first step, followed by using a molar ratio of methanol to product from the first step of 9.1:1, 1 wt.% KOH, at 55 °C with a reaction time of 60 min in the second step. The percentage of methyl ester in the obtained product was 90.56 ± 0.28%. In addition, the fuel properties of the produced biodiesel were in the acceptable ranges according to the Thai standard for community biodiesel. (author)
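Response surface methodology of the kind applied here fits a second-order polynomial to designed-experiment data and locates its stationary point. A one-factor sketch with synthetic yield values (not the paper's measurements):

```python
import numpy as np

# Hypothetical single-factor illustration: methyl ester yield as a
# quadratic in the methanol-to-oil molar ratio, fitted by ordinary
# least squares as in response surface methodology (synthetic data).
ratio = np.array([3.0, 4.5, 6.0, 7.5, 9.0])
yield_pct = np.array([70.0, 84.0, 90.0, 88.0, 78.0])

# Design matrix for y = b0 + b1*x + b2*x^2
A = np.column_stack([np.ones_like(ratio), ratio, ratio ** 2])
b0, b1, b2 = np.linalg.lstsq(A, yield_pct, rcond=None)[0]
optimum = -b1 / (2 * b2)   # stationary point of the fitted parabola
```

In the actual study the same idea is applied over several factors at once (a Box-Behnken design), and the stationary point of the fitted surface gives the recommended operating conditions.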

1. USING STATISTICAL PROCESS CONTROL TO MONITOR RADIOACTIVE WASTE CHARACTERIZATION AT A RADIOACTIVE FACILITY

Energy Technology Data Exchange (ETDEWEB)

WESTCOTT, J.L.

2006-11-15

Two facilities for storing spent nuclear fuel underwater at the Hanford site in southeastern Washington State are being removed from service, decommissioned, and prepared for eventual demolition. The fuel-storage facilities consist of two separate basins called K East (KE) and K West (KW) that are large subsurface concrete pools filled with water, with a containment structure over each. The basins presently contain sludge, debris, and equipment that have accumulated over the years. The spent fuel has been removed from the basins. The process for removing the remaining sludge, equipment, and structure has been initiated for the basins. Ongoing removal operations generate solid waste that is being treated as required, and then disposed. The waste, equipment and building structures must be characterized to properly manage, ship, treat (if necessary), and dispose as radioactive waste. As the work progresses, it is expected that radiological conditions in each basin may change as radioactive materials are being moved within and between the basins. It is imperative that these changing conditions be monitored so that radioactive characterization of waste is adjusted as necessary.

2. USING STATISTICAL PROCESS CONTROL TO MONITOR RADIOACTIVE WASTE CHARACTERIZATION AT A RADIOACTIVE FACILITY

Energy Technology Data Exchange (ETDEWEB)

WESTCOTT, J.L.; JOCHEN; PREVETTE

2007-01-02

Two facilities for storing spent nuclear fuel underwater at the Hanford site in southeastern Washington State are being removed from service, decommissioned, and prepared for eventual demolition. The fuel-storage facilities consist of two separate basins called K East (KE) and K West (KW) that are large subsurface concrete pools filled with water, with a containment structure over each. The basins presently contain sludge, debris, and equipment that have accumulated over the years. The spent fuel has been removed from the basins. The process for removing the remaining sludge, equipment, and structure has been initiated for the basins. Ongoing removal operations generate solid waste that is being treated as required, and then disposed. The waste, equipment and building structures must be characterized to properly manage, ship, treat (if necessary), and dispose as radioactive waste. As the work progresses, it is expected that radiological conditions in each basin may change as radioactive materials are being moved within and between the basins. It is imperative that these changing conditions be monitored so that radioactive characterization of waste is adjusted as necessary.

3. Improvement of methods of determining design values of soils' strength characteristics on the basis of statistical processing of them

Energy Technology Data Exchange (ETDEWEB)

Ignatova, O.I.

1994-05-01

This article considers the question of statistical processing of test data on single-plane shear strength for computing the design values of strength parameters φ and c. It is noted that the procedure of State All-Union Standard 20522-75 does not fully take into account the peculiarities of test data, and the recommendations for constructing confidence boundaries for φ and c are incorrect. A conclusion is drawn about the need to improve the State All-Union Standard in effect.

4. The Implementation of SPC Statistical Process Control

Institute of Scientific and Technical Information of China (English)

谢有明

2015-01-01

SPC is a tool for process control based on mathematical statistics. The paper introduces the principles of SPC and the steps for implementing it, and uses a concrete case to illustrate how SPC is applied in product control.
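The core instrument of SPC as introduced in this record is the Shewhart control chart. A minimal sketch of X-bar chart limits computed from rational subgroups (synthetic measurements; the d2 constant shown is the standard value for subgroups of size 5):

```python
import numpy as np

def xbar_limits(subgroups):
    """Shewhart X-bar control limits using the range-based estimate of
    sigma (d2 method). Assumes subgroups of size 5."""
    d2 = 2.326                                  # d2 for subgroup size 5
    means = subgroups.mean(axis=1)
    ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
    center = means.mean()                       # grand mean (center line)
    sigma_xbar = ranges.mean() / d2 / np.sqrt(subgroups.shape[1])
    return center - 3 * sigma_xbar, center, center + 3 * sigma_xbar

# Synthetic process: 25 subgroups of 5 measurements each
rng = np.random.default_rng(1)
data = rng.normal(10.0, 0.5, size=(25, 5))
lcl, cl, ucl = xbar_limits(data)
```

Subgroup means falling outside (lcl, ucl) would signal an out-of-control state attributable to assignable causes.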

5. Recognition of unnatural variation patterns in metal-stamping process using artificial neural network and statistical features

Science.gov (United States)

Rahman, Norasulaini Abdul; Masood, Ibrahim; Nasrull Abdol Rahman, Mohd

2016-11-01

Unnatural process variation (UPV) is a central concern in the quality control of a metal-stamping process and a major contributor to poor product quality. The sources of UPV are usually special causes. There is still debate among researchers about an effective technique for on-line monitoring and diagnosis of the sources of UPV. Control chart pattern recognition (CCPR) is the most investigated technique. Existing CCPR schemes were mainly developed using raw data-based artificial neural network (ANN) recognizers, with process samples mostly generated artificially from mathematical equations, because real process samples are commonly confidential or not economically available. In this research, a statistical features-ANN recognizer was utilized as the control chart pattern recognizer, with process samples taken directly from an actual manufacturing process. Based on dynamic data training, the proposed recognizer resulted in better monitoring-diagnosis performance (Normal = 100%, Unnatural = 100%) compared to the raw data-based ANN (Normal = 66.67%, Unnatural = 26.97%).
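The statistical-features approach replaces raw chart samples with summary descriptors before recognition. A sketch of an illustrative feature set (an assumed subset for demonstration, not the paper's exact features or its ANN):

```python
import numpy as np

def window_features(x):
    """Summary features of a control-chart window that a pattern
    recognizer could consume instead of raw samples (illustrative)."""
    t = np.arange(len(x))
    slope = np.polyfit(t, x, 1)[0]          # linear trend coefficient
    return {
        "mean": float(x.mean()),
        "std": float(x.std(ddof=1)),
        "slope": float(slope),
        # Mean crossings drop sharply for trend/shift patterns
        "mean_crossings": int(np.sum(np.diff(np.sign(x - x.mean())) != 0)),
    }

rng = np.random.default_rng(2)
natural = rng.normal(0, 1, 50)              # in-control pattern
trend = natural + 0.2 * np.arange(50)       # upward-trend (unnatural) pattern
f_nat, f_tr = window_features(natural), window_features(trend)
```

Feature vectors like these would then be fed to the classifier (an ANN in the paper) to separate natural from unnatural patterns.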

6. Probability of mediastinal involvement in non-small-cell lung cancer: a statistical definition of the clinical target volume for 3-dimensional conformal radiotherapy?

Science.gov (United States)

Giraud, Philippe; De Rycke, Yann; Lavole, Armelle; Milleron, Bernard; Cosset, Jean-Marc; Rosenzweig, Kenneth E

2006-01-01

Conformal irradiation (3D-CRT) of non-small-cell lung carcinoma (NSCLC) is largely based on precise definition of the nodal clinical target volume (CTVn). A reduction of the number of nodal stations to be irradiated would facilitate tumor dose escalation. The aim of this study was to design a mathematical tool based on documented data to predict the risk of metastatic involvement for each nodal station. We reviewed the large surgical series published in the literature to identify the main pretreatment parameters that modify the risk of nodal invasion. The probability of involvement for the 17 nodal stations described by the American Thoracic Society (ATS) was computed from all these publications. Starting with the primary site of the tumor as the main characteristic, we built a probabilistic tree for each nodal station representing the risk distribution as a function of each tumor feature. Statistical analysis used the inversion of probability trees method described by Weinstein and Feinberg. Validation of the software based on 134 patients from two different populations was performed by receiver operator characteristic (ROC) curves and multivariate logistic regression. Analysis of all of the various parameters of pretreatment staging relative to each level of the ATS map results in 20,000 different combinations. The first parameters included in the tree, depending on tumor site, were histologic classification, metastatic stage, nodal stage weighted as a function of the sensitivity and specificity of the diagnostic examination used (positron emission tomography scan, computed tomography scan), and tumor stage. Software is proposed to compute a predicted probability of involvement of each nodal station for any given clinical presentation. Double cross validation confirmed the methodology. A 10% cutoff point was calculated from ROC and logistic model giving the best prediction of mediastinal lymph node involvement. To more accurately define the CTVn in NSCLC three
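Weighting the nodal stage by the sensitivity and specificity of the diagnostic examination amounts to a Bayes computation of post-test probability. A sketch with hypothetical figures (not values from the study's probabilistic-tree software):

```python
def posterior_given_positive(prior, sensitivity, specificity):
    """Post-test probability of nodal involvement after a positive
    imaging result, by Bayes' rule (illustrative, hypothetical values)."""
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

# Assumed example: 20% pretest risk, imaging sensitivity 0.85, specificity 0.90
post = posterior_given_positive(0.20, 0.85, 0.90)
```

The study's software chains many such conditional updates (by tumor site, histology, T and M stage) through inverted probability trees; this single-step update only illustrates the arithmetic behind the weighting.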

7. Retroviral integration process in the human genome: is it really non-random? A new statistical approach.

Directory of Open Access Journals (Sweden)

Alessandro Ambrosi

Full Text Available Retroviral vectors are widely used in gene therapy to introduce therapeutic genes into patients' cells, since, once delivered to the nucleus, the genes of interest are stably inserted (integrated) into the target cell genome. There is now compelling evidence that integration of retroviral vectors follows non-random patterns in the mammalian genome, with a preference for active genes and regulatory regions. In particular, Moloney Leukemia Virus (MLV)-derived vectors show a tendency to integrate in the proximity of the transcription start site (TSS) of genes, occasionally resulting in the deregulation of gene expression and, where proto-oncogenes are targeted, in tumor initiation. This has drawn the attention of the scientific community to the molecular determinants of the retroviral integration process as well as to statistical methods to evaluate the genome-wide distribution of integration sites. In recent approaches, the observed distribution of MLV integration distances (IDs) from the TSS of the nearest gene is assumed to be non-random by empirical comparison with a random distribution generated by computational simulation procedures. To provide a statistical procedure to test the randomness of the retroviral insertion pattern, we propose a probability model (Beta distribution) based on IDs between two consecutive genes. We apply the procedure to a set of 595 unique MLV insertion sites retrieved from human hematopoietic stem/progenitor cells. The statistical goodness of fit test shows the suitability of this distribution to the observed data. Our statistical analysis confirms the preference of MLV-based vectors to integrate in promoter-proximal regions.
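The proposed procedure can be sketched as fitting a Beta distribution to integration distances normalized by intergenic length and checking goodness of fit; here with synthetic data in place of the real 595 insertion sites, and a Kolmogorov-Smirnov test standing in for the authors' exact test:

```python
import numpy as np
from scipy import stats

# Normalized IDs lie in (0, 1): distance from the upstream gene divided
# by the intergenic length. Synthetic sample skewed toward the TSS
# (shape parameters are assumptions, not estimates from the study).
rng = np.random.default_rng(3)
norm_ids = rng.beta(0.8, 2.0, size=595)

# Fit a Beta distribution on (0, 1) and test goodness of fit
a, b, loc, scale = stats.beta.fit(norm_ids, floc=0, fscale=1)
ks = stats.kstest(norm_ids, "beta", args=(a, b, loc, scale))
```

A shape parameter a < 1 concentrates mass near 0 (promoter-proximal integration); under a uniform (random) insertion pattern one would instead expect a Beta(1, 1) fit.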

8. AIED 2009 Workshops Proceedings Volume 10: Natural Language Processing in Support of Learning: Metrics, Feedback and Connectivity

NARCIS (Netherlands)

Dessus, Philippe; Trausan-Matu, Stefan; Van Rosmalen, Peter; Wild, Fridolin

2009-01-01

Dessus, P., Trausan-Matu, S., Van Rosmalen, P., & Wild, F. (Eds.) (2009). AIED 2009 Workshops Proceedings Volume 10 Natural Language Processing in Support of Learning: Metrics, Feedback and Connectivity. In S. D. Craig & D. Dicheva (Eds.), AIED 2009: 14th International Conference in Artificial

9. TV Trouble-Shooting Manual. Volumes 5-6. Part 2: Video Signal Processing Circuit. Student and Instructor's Manuals.

Science.gov (United States)

Mukai, Masaaki; Kobayashi, Ryozo

These volumes are, respectively, the self-instructional student manual and the teacher manual that cover the second set of training topics in this course for television repair technicians. Both contain identical explanations of the structure and function of the elements of the video signal processing circuit (the tuner, video intermediate…

10. Effect of the spray volume adjustment model on the efficiency of fungicides and residues in processing tomato

Energy Technology Data Exchange (ETDEWEB)

Ratajkiewicz, H.; Kierzek, R.; Raczkowski, M.; Hołodyńska-Kulas, A.; Łacka, A.; Wójtowicz, A.; Wachowiak, M.

2016-11-01

This study compared the effects of a proportionate spray volume (PSV) adjustment model and a fixed model (300 L/ha) on the infestation of processing tomato with potato late blight (Phytophthora infestans (Mont.) de Bary) (PLB) and on azoxystrobin and chlorothalonil residues in fruits in three consecutive seasons. The fungicides were applied in an alternating system with or without two spreader adjuvants. The proportionate spray volume adjustment model was based on the number of leaves on plants and a spray volume index. The modified Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method was optimized and validated for extraction of azoxystrobin and chlorothalonil residues. Gas chromatography with a nitrogen and phosphorus detector and an electron capture detector was used for the analysis of the fungicides. The results showed that higher fungicidal residues were connected with lower infestation of tomato with PLB. The PSV adjustment model resulted in lower infestation of tomato than the fixed model (300 L/ha) when fungicides were applied at half the dose without adjuvants. Higher expected spray interception into the tomato canopy with the PSV system was recognized as the reason for the better control of PLB. The spreader adjuvants did not have a positive effect on the biological efficacy of the spray volume application systems. The results suggest that the PSV adjustment model can be used to determine the spray volume for fungicide application in processing tomato crops. (Author)

11. Effect of the spray volume adjustment model on the efficiency of fungicides and residues in processing tomato

Directory of Open Access Journals (Sweden)

Henryk Ratajkiewicz

2016-08-01

Full Text Available This study compared the effects of a proportionate spray volume (PSV) adjustment model and a fixed model (300 L/ha) on the infestation of processing tomato with potato late blight (Phytophthora infestans (Mont.) de Bary) (PLB) and on azoxystrobin and chlorothalonil residues in fruits in three consecutive seasons. The fungicides were applied in an alternating system with or without two spreader adjuvants. The proportionate spray volume adjustment model was based on the number of leaves on plants and a spray volume index. The modified Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method was optimized and validated for extraction of azoxystrobin and chlorothalonil residues. Gas chromatography with a nitrogen and phosphorus detector and an electron capture detector was used for the analysis of the fungicides. The results showed that higher fungicidal residues were connected with lower infestation of tomato with PLB. The PSV adjustment model resulted in lower infestation of tomato than the fixed model (300 L/ha) when fungicides were applied at half the dose without adjuvants. Higher expected spray interception into the tomato canopy with the PSV system was recognized as the reason for the better control of PLB. The spreader adjuvants did not have a positive effect on the biological efficacy of the spray volume application systems. The results suggest that the PSV adjustment model can be used to determine the spray volume for fungicide application in processing tomato crops.

12. From knowledge to recognition: Evaluation of the process of a supervision group through lexical and textual statistical analysis

Directory of Open Access Journals (Sweden)

Giovanna Di Falco

2014-09-01

Full Text Available The following article presents the process analysis of group supervision within a Therapeutic Community for adolescents with psychiatric problems. The main aim is to explore the influence of the group supervision on the quality of patient care and, at the same time, on the relationship between the therapists. The evaluation was made by recording the group meetings and analyzing the transcriptions (representations of the therapists’ speeches in written form) through the use of statistical text analysis. Data analysis describes the frame of an institution – the TC – where it seems possible to “think” and “rethink” the clinical work and the relationship with patients, bearing in mind – at the same time – institutional aspects; the patients’ family role; the social context; and the emotional and relational aspects of the therapists. Keywords: Process, Supervision, Therapeutic community

13. Statistics Clinic

Science.gov (United States)

Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

2014-01-01

Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

14. Statistical medium optimization of an alkaline protease from Pseudomonas aeruginosa MTCC 10501, its characterization and application in leather processing.

Science.gov (United States)

Boopathy, Naidu Ramachandra; Indhuja, Devadas; Srinivasan, Krishnan; Uthirappan, Mani; Gupta, Rishikesh; Ramudu, Kamini Numbi; Chellan, Rose

2013-04-01

Proteases offer a greener mode of application in leather processing for dehairing of goat skins and cow hides. Production of a protease with potent activity by submerged fermentation is reported using a new isolate, P. aeruginosa MTCC 10501. The production parameters were optimized by statistical methods such as Plackett-Burman design and response surface methodology. The optimized production medium contained (g/L): tryptone, 2.5; yeast extract, 3.0; skim milk, 30.0; dextrose, 1.0; inoculum concentration 4%; initial pH 6.0; incubation temperature 30 degrees C; and optimum production at 48 h with a protease activity of 7.6 U/mL. The protease had the following characteristics: pH optimum, 9.0; temperature optimum, 50 degrees C; pH stability between 5.0-10.0 and temperature stability between 10-40 degrees C. The protease was observed to have high potential for dehairing of goat skins in the pre-tanning process, comparable to that of the chemical process as evidenced by histology. The method offers cleaner processing using enzyme only, instead of toxic chemicals, in the pre-tanning process of leather manufacture.

15. Waste Receiving and Processing Facility Module 2A: Advanced Conceptual Design Report. Volume 3B

Energy Technology Data Exchange (ETDEWEB)

1994-03-01

This volume consists of the following sections: WRAP 2A value engineering assessment, resolution of value engineering assessment actions (white paper), HAZOP studies for identifying major safety and operability problems, and time and motion simulation.

16. Landslides triggered by the Gorkha earthquake in the Langtang valley, volumes and initiation processes

Science.gov (United States)

Lacroix, Pascal

2016-03-01

The Gorkha earthquake (Nepal, 2015, Mw 7.9) triggered many landslides. The most catastrophic mass movement was a debris avalanche that buried several villages in the Langtang valley. In this study, questions are raised about its volume and initiation. I investigate the possibility of high-resolution digital surface models computed from tri-stereo SPOT6/7 images to resolve this issue. This high-resolution dataset enables me to derive an inventory of 160 landslides triggered by this earthquake. I analyze the sources of error and estimate the uncertainties in the landslide volumes. The vegetation prevents correct estimation of the volumes of landslides that occurred in vegetated areas. However, I evaluate the volume and thickness of 73 landslides developing in vegetation-free areas, showing a power law between their surface areas and volumes with an exponent of 1.20. Accumulation and depletion volumes are also well constrained for larger landslides, and I find that the main debris avalanche accumulated 6.95 × 10⁶ m³ of deposits in the valley with thicknesses reaching 60 m, and 9.66 × 10⁶ m³ in the glaciated part above 5000 m asl. The large amount of sediment is explained by an initiation of the debris avalanche due to serac falls and snow avalanches from five separate places between 6800 and 7200 m asl over a 3 km length.
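The reported area-volume power law V = c·A^γ is conventionally estimated as a straight-line fit in log-log space. A sketch on synthetic landslide data generated to mimic the reported exponent (not the study's inventory):

```python
import numpy as np

# Synthetic inventory of 73 landslides following V = c * A^1.2 with
# lognormal scatter (all values assumed for illustration)
rng = np.random.default_rng(4)
area = 10 ** rng.uniform(2, 5, size=73)                 # areas in m^2
volume = 0.1 * area ** 1.2 * 10 ** rng.normal(0, 0.05, size=73)

# Power-law exponent = slope of the log-log least-squares fit
gamma, log_c = np.polyfit(np.log10(area), np.log10(volume), 1)
```

An exponent near 1.2, as found in the study, implies that mean landslide thickness grows slowly with footprint area (thickness ∝ A^0.2).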

17. Tutorial for Collecting and Processing Images of Composite Structures to Determine the Fiber Volume Fraction

Science.gov (United States)

Conklin, Lindsey

2017-01-01

Fiber-reinforced composite structures have become more common in aerospace components due to their light weight and structural efficiency. In general, the strength and stiffness of a composite structure are directly related to the fiber volume fraction, which is defined as the fraction of fiber volume to total volume of the composite. The most common method to measure the fiber volume fraction is acid digestion, which is useful when the total weight of the composite and the fiber weight can easily be obtained. However, acid digestion is a destructive test, so the material will no longer be available for additional characterization. It can also be difficult to machine out specific components of a composite structure with complex geometries for acid digestion. These disadvantages of acid digestion led the author to develop a method to calculate the fiber volume fraction. The developed method uses optical microscopy to calculate the fiber area fraction based on images of the cross section of the composite. The fiber area fraction and fiber volume fraction are understood to be the same, based on the assumption that the shape and size of the fibers are consistent through the depth of the composite. This tutorial explains the developed method for optically determining fiber area fraction performed at NASA Langley Research Center.
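The fiber area fraction computation reduces to counting pixels classified as fiber in a segmented cross-section image. A minimal sketch on a synthetic image using a simple global threshold (the actual NASA procedure may segment differently):

```python
import numpy as np

def fiber_area_fraction(image, threshold):
    """Fraction of pixels brighter than the threshold, taken as fiber;
    a minimal stand-in for the micrograph segmentation described above."""
    return float(np.mean(image > threshold))

# Synthetic cross-section: four bright circular "fibers" on a dark matrix
img = np.zeros((100, 100))
yy, xx = np.mgrid[:100, :100]
for cy, cx in [(25, 25), (25, 75), (75, 25), (75, 75)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= 15 ** 2] = 1.0

vf = fiber_area_fraction(img, 0.5)
```

Under the tutorial's assumption that fiber shape and size are constant through the depth, this area fraction is taken as the fiber volume fraction.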

18. A MATHEMATICAL MODEL OF OPTIMIZATION OF THE VOLUME OF MATERIAL FLOWS IN GRAIN PROCESSING INTEGRATED PRODUCTION SYSTEMS

Directory of Open Access Journals (Sweden)

Baranovskaya T. P.

2014-06-01

Full Text Available The article suggests a mathematical model for optimization of the volume of material flows: a model for ideal conditions; a model for working conditions; and a generalized model for determining the optimal input parameters. These models optimize such inventory-management parameters of technologically integrated grain production systems as the number of supply cycles and the volume of the source material and financial flows. The study was carried out on the example of an integrated system for the production, processing and sale of wheat (bread with the full technological cycle

19. Statistical mechanics

CERN Document Server

Sheffield, Scott

2009-01-01

In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

20. FREIDLIN-WENTZELL TYPE ESTIMATES AND THE LAW OF THE ITERATED LOGARITHM FOR A CLASS OF STOCHASTIC PROCESSES RELATED TO SYMMETRIC STATISTICS

OpenAIRE

Mori, Toshio; Oodaira, Hiroshi

1988-01-01

Analogues of Freidlin and Wentzell's estimates for diffusion processes and the functional law of the iterated logarithm are obtained for a class of stochastic processes represented by multiple Wiener integrals with respect to two parameter Wiener processes, which arise as the limit processes of sequences of normalized symmetric statistics.

1. Statistical Image Processing.

Science.gov (United States)

1982-11-16

[Abstract garbled in scanning; recoverable keywords: spectral analysis, texture image analysis and classification, image software package, automatic spatial clustering.]

2. Multi-criteria optimization of the flesh melons skin separation process by experimental and statistical analysis methods

Directory of Open Access Journals (Sweden)

Y. B. Medvedkov

2016-01-01

Full Text Available Research and innovation aimed at creating energy-efficient processes for melon processing is a significant task. Separating the skin from the melon flesh, with subsequent use of both in new food products, is one of the most labor-intensive operations in this technology. The lack of a scientific and experimental basis for this operation is holding back the development of high-performance machines to carry it out. In this connection, a technique is offered for experiments on separating melon skins in a pilot plant, and for finding optimal operating regimes by statistical modeling. The late-ripening melon varieties Kalaysan, Thorlami and Gulab-sary were the objects of study. The interaction of factors influencing the skin separation process was examined using a central composite rotatable design and a fractional factorial experiment. Experimental design with the treatment planning template in the Design Expert v.10 software yielded regression equations that adequately describe the actual process. Rational intervals for the input factor values were established: the ratio of the drum rotational speed to the abrasive supply roll rotational frequency; the gap between the supply drum and the shearing knife; the shearing blade sharpening angle; the number of feed drum spikes; and the abrading drum orifice diameter. The mean square error does not exceed 12.4%. A graphic interpretation of the regression equations is presented as scatter plots and engineering nomograms that allow prediction of rational input factor values for three optimization criteria: minimal specific energy consumption of cutting, maximal specific throughput of pulp, and maximal pulp extraction ratio. The data obtained can be used for operational management of the process parameters, taking into account the geometrical dimensions of the melon and its inhomogeneous structure.

3. Reducing uncertainty in the selection of bi-variate distributions of flood peaks and volumes using copulas and hydrological process-based model selection

Science.gov (United States)

Szolgay, Jan; Gaál, Ladislav; Bacigál, Tomáš; Kohnová, Silvia; Blöschl, Günter

2016-04-01

and snow-melt floods. First, empirical copulas for the individual processes were compared at each site separately in order to assess whether peak-volume relationships differ in terms of the respective flood processes. Next, the similarity of empirical distributions was tested process-wise in a regional perspective. In the last step, the goodness-of-fit of frequently used copula types was examined both for process-based data samples (the current approach, based on a wider database of flood events) and annual maximum floods (the traditional approach that makes use of a limited number of events). It was concluded that, in order to reduce the uncertainty in model selection and parameter estimation, it is necessary to treat flood processes separately and analyze all available independent floods. Given that usually more than one statistically suitable copula model exists in practice, an uncertainty analysis of the design values in engineering studies resulting from the model selection is necessary. It was shown that reducing uncertainty in the choice of model can be attempted by a deeper hydrological analysis of the dependence structure and the model's suitability in specific hydrological environments, or by a more specific distinction of the typical flood generation mechanisms.
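For Archimedean copulas such as the Gumbel, often used for flood peak-volume pairs, the dependence parameter can be estimated from Kendall's tau by the method of moments, θ = 1/(1 − τ). A sketch on synthetic positively dependent peaks and volumes (not the study's flood data, and only one of the copula families the study compares):

```python
import numpy as np
from scipy import stats

# Synthetic positively dependent flood peaks and volumes (lognormal
# margins; the monotone transform leaves Kendall's tau unchanged)
rng = np.random.default_rng(5)
z = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=300)
peaks, volumes = np.exp(z[:, 0]), np.exp(z[:, 1])

tau, _ = stats.kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)        # Gumbel copula parameter (moments estimate)
```

θ = 1 corresponds to independence; larger θ means stronger upper-tail dependence between peaks and volumes, the property that matters for design floods.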

4. USING STATISTICAL PROCESS CONTROL AND SIX SIGMA TO CRITICALLY ANALYSE SAFETY OF HELICAL SPRINGS: A RAILWAY CASE STUDY

Directory of Open Access Journals (Sweden)

Fulufhelo Ṋemavhola

2017-09-01

Full Text Available The paper examines the life-quality evaluation of helical coil springs in the railway industry, as it affects the safety of transporting goods and people. The types of spring considered are the external spring, the internal spring and the stabiliser spring. Statistical process control was utilised as the fundamental instrument in the investigation. Measurements were performed using a measuring tape, dynamic actuators and a vernier caliper. The purpose of this research was to examine the usability of old helical springs found in a railway environment. The goal of the experiment was to obtain factual statistical information on the life quality of the helical springs used in railroad transportation. Six sigma principles were additionally applied in this paper. According to the six sigma measurement analysis, only the stabiliser and inner springs met the six sigma prerequisites for coil bar diameter. It is concluded that the coil springs should be replaced, as they do not meet the six sigma requirements.

5. Envelope statistics of self-motion signals experienced by human subjects during everyday activities: Implications for vestibular processing.

Science.gov (United States)

Carriot, Jérome; Jamali, Mohsen; Cullen, Kathleen E; Chacron, Maurice J

2017-01-01

There is accumulating evidence that the brain's neural coding strategies are constrained by natural stimulus statistics. Here we investigated the statistics of the time-varying envelope (i.e., a second-order stimulus attribute that is related to variance) of rotational and translational self-motion signals experienced by human subjects during everyday activities. We found that envelopes can reach large values across all six motion dimensions (~450 deg/s for rotations and ~4 G for translations). Unlike results obtained in other sensory modalities, the spectral power of envelope signals decreased slowly for low (<2 Hz) temporal frequencies and thus was not well fit by a power law. We next compared the spectral properties of envelope signals resulting from active and passive self-motion, as well as those resulting from signals obtained when the subject is absent (i.e., external stimuli). Our data suggest that different mechanisms underlie deviation from scale invariance in rotational and translational self-motion envelopes. Specifically, active self-motion and filtering by the human body cause deviation from scale invariance primarily for translational and rotational envelope signals, respectively. Finally, we used well-established models in order to predict the responses of peripheral vestibular afferents to natural envelope stimuli. We found that irregular afferents responded more strongly to envelopes than their regular counterparts. Our findings have important consequences for understanding the coding strategies used by the vestibular system to process natural second-order self-motion signals.

6. A novel Q-based online model updating strategy and its application in statistical process control for rubber mixing

Institute of Scientific and Technical Information of China (English)

Chunying Zhang; Sun Chen; Fang Wu; Kai Song

2015-01-01

To overcome the large time-delay in measuring the hardness of mixed rubber, rheological parameters were used to predict the hardness. A novel Q-based model updating strategy was proposed as a universal platform to track time-varying properties. Using a few selected support samples to update the model, the strategy can dramatically reduce storage cost and overcome the adverse influence of low signal-to-noise-ratio samples. Moreover, it can be applied to any statistical process monitoring system without drastic changes, which makes it practical for industrial use. As examples, the Q-based strategy was integrated with three popular algorithms (partial least squares (PLS), recursive PLS (RPLS), and kernel PLS (KPLS)) to form novel regression methods, QPLS, QRPLS and QKPLS, respectively. Applications to predicting mixed-rubber hardness at a large-scale tire plant in east China confirm the theoretical considerations.

7. High productivity chromatography refolding process for Hepatitis B Virus X (HBx) protein guided by statistical design of experiment studies.

Science.gov (United States)

Basu, Anindya; Leong, Susanna Su Jan

2012-02-01

The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacterial host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate its direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process, which delivered bioactive HBx at a productivity of 0.21 mg/ml/h and a refolding yield of 54% (at 10 mg/ml refolding concentration), 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study provided important insights into the effects of different bioprocess parameters, such as the effect of buffer-exchange gradients on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, and hence help to resolve the validation and speed-to-market challenges faced by the biopharmaceutical industry today.

8. Characterization of mechanisms and processes of groundwater salinization in irrigated coastal area using statistics, GIS, and hydrogeochemical investigations.

Science.gov (United States)

Bouzourra, Hazar; Bouhlila, Rachida; Elango, L; Slama, Fairouz; Ouslati, Naceur

2015-02-01

Coastal aquifers are under threat of salinization in most parts of the world. This study was carried out in the coastal shallow aquifers of Aousja-Ghar El Melh and Kalâat el Andalous, northeastern Tunisia, with the objective of identifying the sources and processes of groundwater salinization. Groundwater samples were collected from 42 shallow dug wells during July and September 2007. Chemical parameters such as Na(+), Ca(2+), Mg(2+), K(+), Cl(-), SO4(2-), HCO3(-), NO3(-), Br(-), and F(-) were analyzed. A combination of hydrogeochemical, statistical, and GIS approaches was used to identify the main sources of salinization and contamination of these shallow coastal aquifers as follows: (i) water-rock interaction, (ii) evapotranspiration, (iii) saltwater intrusion, which began before 1972 and is still continuing, (iv) irrigation return flow, (v) sea aerosol spray, and finally (vi) agricultural fertilizers. During 2005/2006, overexploitation of the renewable water resources of the aquifers caused saline water intrusion. In 2007, freshening of brackish-saline groundwater occurred under natural recharge conditions by Ca-HCO3 meteoric freshwater. Cation exchange processes occur at the fresh-saline mixing interfaces along the hydraulic gradient. Sulfate reduction and the neo-formation of clay minerals characterize the hypersaline coastal Sebkha environments. Evaporation tends to increase the concentrations of solutes in groundwater from the recharge areas to the discharge areas and leads to the precipitation of carbonate and sulfate minerals.

9. Modeling statistics and kinetics of the natural aggregation structures and processes with the solution of generalized logistic equation

Science.gov (United States)

Maslov, Lev A.; Chebotarev, Vladimir I.

2017-02-01

The generalized logistic equation is proposed to model the kinetics and statistics of natural processes such as earthquakes, forest fires, floods, landslides, and many others. This equation has the form dN(A)/dA = s·(1−N(A))·N(A)^q·A^(−α), where q > 0, A > 0 is the size of an element of a structure, and α ≥ 0. The equation contains two exponents, α and q, taking into account two important properties of the elements of a system: their fractal geometry, and their ability to interact either to enhance or to damp the process of aggregation. The function N(A) can be understood as an approximation to the number of elements whose size is less than A. The function dN(A)/dA, where N(A) is the general solution of this equation for q = 1, is a product of an increasing bounded function and a power-law function with stretched-exponential cut-off. The relation with Tsallis non-extensive statistics is demonstrated by solving the generalized logistic equation for q > 0. In the case 0 < q < 1 it models super-additive structures, and in the case q > 1 it models sub-additive structures. The Gutenberg-Richter (G-R) formula results from interpretation of empirical data as a straight line in the region of the stretched exponent with small α. The solution is applied to modeling the distribution of foreshocks and aftershocks in the regions of the Napa Valley 2014 and Sumatra 2004 earthquakes, fitting the observed data well, both qualitatively and quantitatively.
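A minimal numerical sketch of the generalized logistic equation dN/dA = s(1−N)N^q A^(−α); the parameter values are chosen for illustration only, not fitted to the earthquake catalogs discussed in the abstract:

```python
import numpy as np

def solve_generalized_logistic(s=1.0, q=1.0, alpha=0.5, n0=1e-3,
                               a_start=0.1, a_end=50.0, steps=50000):
    """Forward-Euler integration of dN/dA = s*(1-N)*N**q * A**(-alpha)."""
    a_grid = np.linspace(a_start, a_end, steps)
    da = a_grid[1] - a_grid[0]
    n = np.empty(steps)
    n[0] = n0
    for i in range(steps - 1):
        dn = s * (1.0 - n[i]) * n[i] ** q * a_grid[i] ** (-alpha)
        n[i + 1] = n[i] + dn * da
    return a_grid, n

a, n = solve_generalized_logistic()
# N(A) rises monotonically and saturates below 1, as a cumulative size count should
```

For q = 1 this reproduces the bounded, saturating cumulative count described in the abstract; varying q above or below 1 changes how strongly aggregation is enhanced or damped.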

10. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

Science.gov (United States)

Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
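A standard one-sided tabular CUSUM of the kind adapted in such person-fit work can be sketched as follows; the reference value k and decision interval h below are textbook defaults, not the report's settings:

```python
import numpy as np

def tabular_cusum(x, mu0, sigma, k=0.5, h=5.0):
    """One-sided upper tabular CUSUM; returns the index of the first alarm
    (standardized C+ exceeding h) or None if the sequence stays in control."""
    c_plus = 0.0
    for i, xi in enumerate(x):
        z = (xi - mu0) / sigma        # standardize the observation
        c_plus = max(0.0, c_plus + z - k)
        if c_plus > h:
            return i
    return None

# Deterministic check: 50 in-control points, then a persistent 1-sigma upward shift.
data = np.concatenate([np.zeros(50), np.ones(30)])
alarm = tabular_cusum(data, mu0=0.0, sigma=1.0)   # alarm fires at index 60
```

In the person-fit setting the monitored quantity would be an item-level misfit statistic rather than a physical measurement, but the accumulation-and-reset logic is the same.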

11. Constitutive Modelling in Thermomechanical Processes, Using The Control Volume Method on Staggered Grid

DEFF Research Database (Denmark)

Thorborg, Jesper

The objective of this thesis has been to improve and further develop the existing staggered grid control volume formulation of the thermomechanical equations. During the last ten years the method has proven to be efficient and accurate even for calculation on large structures. The application of ...

12. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME II: PROCESS OVERVIEW

Science.gov (United States)

This volume presents initial results of a study to identify the issues and barriers associated with retrofitting existing solvent-based equipment to accept waterbased adhesives as part of an EPA effort to improve equipment cleaning in the coated and laminated substrate manufactur...

13. The interpretation of X-ray computed microtomography images of rocks as an application of volume image processing and analysis

OpenAIRE

Kaczmarczyk, J.; Dohnalik, M; Zalewska, J; Cnudde, Veerle

2010-01-01

X-ray computed microtomography (CMT) is a non-destructive method of investigating the internal structure of examined objects. During the reconstruction of CMT measurement data, large volume images are generated; image processing and analysis are therefore very important steps in CMT data interpretation. The first step in analyzing the rocks is image segmentation. Differences in density appear on the reconstructed image as differences in the gray levels of voxels, so the proper threshold...
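Threshold segmentation of a reconstructed gray-level volume can be sketched with Otsu's classic method, which picks the gray level maximizing between-class variance (a generic illustration, not the authors' pipeline; the tiny two-phase "phantom" is invented):

```python
import numpy as np

def otsu_threshold(img, levels=256):
    """Otsu's method: choose the gray level that maximizes between-class variance."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                     # gray-level probabilities
    best_t, best_var = 0, -1.0
    for t in range(levels - 1):
        w0, w1 = p[:t + 1].sum(), p[t + 1:].sum()
        if w0 == 0 or w1 == 0:
            continue                          # one class empty: skip
        mu0 = (np.arange(t + 1) * p[:t + 1]).sum() / w0
        mu1 = (np.arange(t + 1, levels) * p[t + 1:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Two-phase phantom: pore space at gray 50, rock matrix at gray 200
img = np.array([[50, 50, 200], [50, 200, 200], [200, 200, 200]], dtype=np.uint8)
t = otsu_threshold(img)
segmented = img > t          # True = matrix voxels, False = pores
```

The same per-voxel thresholding extends directly to 3D arrays, which is the usual first segmentation step for rock CMT volumes.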

14. Trends in lumber processing in the western United States. Part I: board foot Scribner volume per cubic foot of timber

Science.gov (United States)

Charles E. Keegan; Todd A. Morgan; Keith A. Blatner; Jean M. Daniels

2010-01-01

This article describes trends in board foot Scribner volume per cubic foot of timber for logs processed by sawmills in the western United States. Board foot to cubic foot (BF/CF) ratios for the period from 2000 through 2006 ranged from 3.70 in Montana to 5.71 in the Four Corners Region (Arizona, Colorado, New Mexico, and Utah). Sawmills in the Four Corners Region,...

15. Economic impacts of climate change on agriculture: a comparison of process-based and statistical yield models

Science.gov (United States)

Moore, Frances C.; Baldos, Uris Lantz C.; Hertel, Thomas

2017-06-01

A large number of studies have been published examining the implications of climate change for agricultural productivity that, broadly speaking, can be divided into process-based modeling and statistical approaches. Despite a general perception that results from these methods differ substantially, there have been few direct comparisons. Here we use a database of yield impact studies compiled for the IPCC Fifth Assessment Report (Porter et al 2014) to systematically compare results from process-based and empirical studies. Controlling for differences in the representation of CO2 fertilization between the two methods, we find little evidence for differences in the yield response to warming. The magnitude of CO2 fertilization is instead a much larger source of uncertainty. Based on this set of impact results, we find very limited potential for on-farm adaptation to reduce yield impacts. We use the Global Trade Analysis Project (GTAP) global economic model to estimate the welfare consequences of yield changes and find negligible welfare changes for warming of 1 °C-2 °C if CO2 fertilization is included and large negative effects on welfare without CO2. Uncertainty bounds on welfare changes are highly asymmetric, showing substantial probability of large declines in welfare for warming of 2 °C-3 °C even including the CO2 fertilization effect.

16. Processes affecting Aedes aegypti (Diptera: Culicidae) infestation and abundance: inference through statistical modeling and risk maps in northern Argentina.

Science.gov (United States)

Garelli, F M; Espinosa, M O; Gürtler, R E

2012-05-01

Understanding the processes that affect Aedes aegypti (L.) (Diptera: Culicidae) may serve as a starting point to create and/or improve vector control strategies. For this purpose, we performed statistical modeling of three entomological surveys conducted in Clorinda City, northern Argentina. Previous 'basic' models of presence or absence of larvae and/or pupae (infestation) and the number of pupae in infested containers (productivity), mainly based on physical characteristics of containers, were expanded to include variables selected a priori reflecting water use practices, vector-related context factors, the history of chemical control, and climate. Model selection was performed using Akaike's Information Criterion. In total, 5,431 water-holding containers were inspected and 12,369 Ae. aegypti pupae collected from 963 positive containers. Large tanks were the most productive container type. Variables reflecting every putative process considered, except for history of chemical control, were selected in the best models obtained for infestation and productivity. The associations found were very strong, particularly in the case of infestation. Water use practices and vector-related context factors were the most important ones, as evidenced by their impact on Akaike's Information Criterion scores of the infestation model. Risk maps based on empirical data and model predictions showed a heterogeneous distribution of entomological risk. An integrated vector control strategy is recommended, aiming at community participation for healthier water use practices and targeting large tanks for key elements such as lid status, water addition frequency and water use.
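Model selection by Akaike's Information Criterion, as used in the study above, can be sketched for least-squares fits; the synthetic data are purely illustrative (for Gaussian errors, AIC reduces to n·ln(RSS/n) + 2k up to an additive constant):

```python
import numpy as np

def gaussian_aic(y, y_hat, n_params):
    """AIC for a least-squares fit with Gaussian errors:
    AIC = n*ln(RSS/n) + 2k (the additive constant cancels in comparisons)."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

x = np.linspace(0, 10, 60)
y = 2.0 + 0.5 * x + 0.3 * x ** 2 + np.sin(x)     # curved "field data" stand-in
lin = np.polyval(np.polyfit(x, y, 1), x)          # 2-parameter linear model
quad = np.polyval(np.polyfit(x, y, 2), x)         # 3-parameter quadratic model
aic_lin = gaussian_aic(y, lin, 2)
aic_quad = gaussian_aic(y, quad, 3)
# The quadratic model earns the lower (better) AIC despite its extra parameter
```

Candidate entomological models with different covariate sets would be compared the same way: lowest AIC wins, with the 2k term penalizing extra parameters.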

17. Application of statistical experimental design to study the formulation variables influencing the coating process of lidocaine liposomes.

Science.gov (United States)

González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M

2007-06-07

In this paper, we used statistical experimental design to investigate the effect of several factors on the coating process of lidocaine hydrochloride (LID) liposomes by a biodegradable polymer (chitosan, CH). These variables were the concentration of the CH coating solution, the dripping rate of this solution onto the liposome colloidal dispersion, the stirring rate, the time elapsed between liposome production and coating, and finally the amount of drug entrapped in the liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by the thin-layer evaporation method. They were subsequently coated with CH according to the experimental plan provided by a fractional factorial (2(5-1)) screening matrix. We used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags, and the brilliant red probe was used to determine CE (%), owing to its property of forming molecular complexes with CH. Graphic analysis of the effects allowed identification of the main formulation and technological factors from the selected responses and determination of the proper levels of these factors for response improvement. Moreover, the fractional design allowed quantification of the interactions between the factors, which will be considered in subsequent experiments. The results obtained pointed out that the LID amount was the predominant factor increasing drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all the interactions between the main factors were statistically significant.
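A 2^(5-1) screening matrix of the kind used here can be generated from a full 2^4 design plus a generated fifth column. The defining relation below (E = ABCD, i.e. I = ABCDE) is the standard resolution-V choice and is an assumption, since the abstract does not state the generator:

```python
from itertools import product

def fractional_factorial_2_5_1():
    """Half-fraction for 5 two-level factors in coded (-1/+1) units:
    full 2^4 design in A-D, with the fifth column generated as E = ABCD."""
    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):
        e = a * b * c * d          # defining relation I = ABCDE (resolution V)
        runs.append((a, b, c, d, e))
    return runs

design = fractional_factorial_2_5_1()   # 16 runs instead of the full 32
```

With resolution V, main effects and two-factor interactions are not aliased with each other, which is what lets the authors quantify factor interactions from only half the runs.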

18. METHODS TO RESTRUCTURE THE STATISTICAL COMMUNITIES

Directory of Open Access Journals (Sweden)

Emilia TITAN

2005-12-01

Full Text Available To understand the essence of phenomena, statistical data processing operations must be performed. These allow a shift from individual data to derived, synthetic indicators that highlight the essence of various phenomena. The high volume and diversity of processing operations presuppose developing plans for computerised data processing. To identify distinct and homogeneous groups and classes, well-considered groupings and classifications must be devised that comply with the requirements presented in the article.

19. Waste Receiving and Processing Facility Module 2A: Advanced Conceptual Design Report. Volume 3A

Energy Technology Data Exchange (ETDEWEB)

1994-03-01

The objective of this document is to provide descriptions of all WRAP 2A feed streams, including physical and chemical attributes, and to describe the pathway used to select data for volume estimates. WRAP 2A is being designed for nonthermal treatment of contact-handled mixed low-level waste Categories 1 and 3. It is based on immobilization and encapsulation treatment using grout or polymer.

20. High-throughput in-volume processing in glass with isotropic spatial resolutions in three dimensions

CERN Document Server

Tan, Yuanxin; Chu, Wei; Liao, Yang; Qiao, Lingling; Cheng, Ya

2016-01-01

We report on the fabrication of three-dimensional (3D) microstructures in glass with isotropic spatial resolutions. To achieve high-throughput fabrication, we expand the focal spot size with a low-numerical-aperture lens, which naturally results in a degraded axial resolution. We solve the problem with simultaneous spatiotemporal focusing, which leads to an isotropic laser-affected volume with a spatial resolution of ~100 microns.

1. The critical role of NIR spectroscopy and statistical process control (SPC) strategy towards captopril tablets (25 mg) manufacturing process understanding: a case study.

Science.gov (United States)

Curtivo, Cátia Panizzon Dal; Funghi, Nathália Bitencourt; Tavares, Guilherme Diniz; Barbosa, Sávio Fujita; Löbenberg, Raimar; Bou-Chacra, Nádia Araci

2015-05-01

In this work, a near-infrared spectroscopy (NIRS) method was used to evaluate the uniformity of dosage units of three commercial batches of captopril 25 mg tablets. The performance of the calibration method was assessed by determination of the Q value (0.9986), the standard error of estimation (C-set SEE = 1.956), the standard error of prediction (V-set SEP = 2.076), and the consistency (106.1%). These results indicated the adequacy of the selected model. Method validation revealed agreement between the reference high-pressure liquid chromatography (HPLC) method and the NIRS method. The process evaluation using the NIRS method showed that the variability was due to common causes and that the process delivered predictable results consistently. Cp and Cpk values were, respectively, 2.05 and 1.80. These results revealed a process that is non-centered relative to the average target (100% w/w) within the specified range (85-115%). The predicted probability of failure was 21 per 100 million captopril tablets. NIRS, in combination with multivariate calibration by partial least squares (PLS) regression, allowed the development of a methodology for evaluating the uniformity of dosage units of captopril 25 mg tablets. The statistical process control strategy associated with the NIRS method as a process analytical technology (PAT) played a critical role in understanding the sources and degree of variation and their impact on the process. This approach led towards better process understanding and provided a sound scientific basis for continuous improvement.
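The reported Cp and Cpk follow from the usual capability formulas; below is a sketch with hypothetical numbers (the batch mean and standard deviation are illustrative, not the study's):

```python
def process_capability(mean, sd, lsl, usl):
    """Cp ignores centering; Cpk penalizes a mean shifted toward either spec limit."""
    cp = (usl - lsl) / (6.0 * sd)
    cpk = min(usl - mean, mean - lsl) / (3.0 * sd)
    return cp, cpk

# Hypothetical tablet assay: spec 85-115% of label claim, mean 103%, sd 2.5%
cp, cpk = process_capability(103.0, 2.5, 85.0, 115.0)
# cp = 2.0 but cpk = 1.6: capable, yet off-center toward the upper limit
```

A gap between Cp and Cpk, as in the abstract's 2.05 vs 1.80, is exactly the signature of a capable but non-centered process.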

2. 3D digital image processing for biofilm quantification from confocal laser scanning microscopy: Multidimensional statistical analysis of biofilm modeling

Science.gov (United States)

Zielinski, Jerzy S.

The dramatic increase in the number and volume of digital images produced in medical diagnostics, and the escalating demand for rapid access to these relevant medical data, along with the need for interpretation and retrieval, have become of paramount importance to a modern healthcare system. Therefore, there is an ever-growing need for processed, interpreted and saved images of various types. Due to the high cost and unreliability of human-dependent image analysis, it is necessary to develop an automated method for feature extraction, using sophisticated mathematical algorithms and reasoning. This work is focused on digital image signal processing of biological and biomedical data in one-, two- and three-dimensional space. The methods and algorithms presented in this work were used to acquire data from genomic sequences, breast cancer images, and biofilm images. One-dimensional analysis was applied to DNA sequences, which were represented as a non-stationary sequence and modeled by a time-dependent autoregressive moving average (TD-ARMA) model. Two-dimensional analyses used a 2D-ARMA model and applied it to detect breast cancer from X-ray mammograms or ultrasound images. Three-dimensional detection and classification techniques were applied to biofilm images acquired using confocal laser scanning microscopy. Modern medical images are geometrically arranged arrays of data. The broadening scope of imaging as a way to organize our observations of the biophysical world has led to a dramatic increase in our ability to apply new processing techniques and to combine multiple channels of data into sophisticated and complex mathematical models of physiological function and dysfunction. With the explosion in the amount of data produced in the field of biomedicine, it is crucial to be able to construct accurate mathematical models of the data at hand. The two main purposes of signal modeling are data size conservation and parameter extraction. Specifically, in biomedical imaging we have four key problems
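The ARMA-style sequence modeling described above can be sketched in its simplest autoregressive form, a least-squares AR(2) fit to a synthetic series (illustrative only; the dissertation uses time-dependent and 2D variants):

```python
import numpy as np

def fit_ar2(x):
    """Least-squares estimate of AR(2) coefficients: x[t] ~ phi1*x[t-1] + phi2*x[t-2]."""
    y = x[2:]
    X = np.column_stack([x[1:-1], x[:-2]])    # lagged regressors
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

rng = np.random.default_rng(1)
n, phi1, phi2 = 5000, 0.6, -0.3               # a stationary AR(2)
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.normal()
est = fit_ar2(x)                              # est approximates (0.6, -0.3)
```

The two fitted coefficients are the "extracted parameters" in the abstract's sense: a compact model summary standing in for the raw sequence.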

3. High Volume, Low-Cost Production Process for High-grade Silicon Carbide Optics Project

Data.gov (United States)

National Aeronautics and Space Administration — The following proposal summarizes the process by which Trex will utilize our patented CVC (Chemical Vapor Composite) SiC process to fabricate near net shape...

4. High Volume, Low-Cost Production Process for High-grade Silicon Carbide Optics Project

Data.gov (United States)

National Aeronautics and Space Administration — The following proposal summarizes the process by which Trex Enterprises will utilize our patented CVC (Chemical Vapor Composite) SiC process towards the fabrication...

5. Integrated payload and mission planning, phase 3. Volume 1: Integrated payload and mission planning process evaluation

Science.gov (United States)

Sapp, T. P.; Davin, D. E.

1977-01-01

The integrated payload and mission planning process for STS payloads was defined, and discrete tasks which evaluate performance and support initial implementation of this process were conducted. The scope of activity was limited to NASA and NASA-related payload missions only. The integrated payload and mission planning process was defined in detail, including all related interfaces and scheduling requirements. Related to the payload mission planning process, a methodology for assessing early Spacelab mission manager assignment schedules was defined.

6. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques.

Science.gov (United States)

El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

2017-10-01

Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, integrated hydrochemical investigations involving chemical and statistical analyses were conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality.
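Factor extraction of the kind reported (a handful of factors explaining >77% of variance) can be sketched with PCA via SVD on a synthetic data matrix; the 54-sample shape echoes the study, but the variables, driver and loadings below are invented for illustration:

```python
import numpy as np

def variance_explained(data):
    """Fraction of total variance carried by each principal component
    (PCA via SVD of the column-centered data matrix)."""
    centered = data - data.mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)   # singular values, descending
    return s ** 2 / np.sum(s ** 2)

rng = np.random.default_rng(7)
# 54 samples x 6 variables: one dominant latent driver plus small noise
driver = rng.normal(size=(54, 1))
loadings = rng.normal(size=(1, 6))
data = driver @ loadings + 0.3 * rng.normal(size=(54, 6))
ratios = variance_explained(data)
# the leading component accounts for the bulk of the variance
```

In the study's setting the retained factors are then interpreted chemically (rock-water interaction vs anthropogenic inputs) from their variable loadings.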

7. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

Science.gov (United States)

Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

2015-12-01

In today's highly competitive market, Total Quality Management (TQM) is a vital management tool in ensuring that a company can succeed in its business. In order to survive in the global market, with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential to improving business performance. Results linking TQM and business performance are consistent. However, only a few previous studies have examined the mediator effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediator effect of SPC, using structural equation modelling, which is a more comprehensive model for developing countries, specifically Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The findings show a significant mediating effect between TQM practices and business performance, indicating that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance (BP), with an indirect effect (IE) of 0.25, which can be categorised as a high mediating effect.

8. Non-parametric asymptotic statistics for the Palm mark distribution of \\beta-mixing marked point processes

CERN Document Server

Heinrich, Lothar; Schmidt, Volker

2012-01-01

We consider spatially homogeneous marked point patterns in an unboundedly expanding convex sampling window. Our main objective is to identify the distribution of the typical mark by constructing an asymptotic \\chi^2-goodness-of-fit test. The corresponding test statistic is based on a natural empirical version of the Palm mark distribution and a smoothed covariance estimator which turns out to be mean-square consistent. Our approach does not require independent marks and allows dependences between the mark field and the point pattern. Instead we impose a suitable \\beta-mixing condition on the underlying stationary marked point process which can be checked for a number of Poisson-based models and, in particular, in the case of geostatistical marking. Our method needs a central limit theorem for \\beta-mixing random fields which is proved by extending Bernstein's blocking technique to non-cubic index sets and seems to be of interest in its own right. By large-scale model-based simulations the performance of our t...

9. LIKELIHOOD RATIO TESTS OF HYPOTHESES ON MULTIVARIATE POPULATIONS, VOLUME 1, DISTRIBUTION THEORY--STATISTICAL MODELS FOR THE EVALUATION AND INTERPRETATION OF EDUCATIONAL CRITERIA, PART 4.

Science.gov (United States)

SAW, J.G.

THIS VOLUME DEALS WITH THE BIVARIATE NORMAL DISTRIBUTION. THE AUTHOR MAKES A DISTINCTION BETWEEN DISTRIBUTION AND DENSITY FROM WHICH HE DEVELOPS THE CONSEQUENCES OF THIS DISTINCTION FOR HYPOTHESIS TESTING. OTHER ENTRIES IN THIS SERIES ARE ED 003 044 AND ED 003 045. (JK)

10. Design and optimization of disintegrating pellets of MCC by non-aqueous extrusion process using statistical tools.

Science.gov (United States)

Gurram, Rajesh Kumar; Gandra, Suchithra; Shastri, Nalini R

2016-03-10

The objective of the study was to design and optimize a disintegrating pellet formulation of microcrystalline cellulose (MCC) by a non-aqueous extrusion process for a water-sensitive drug, using various statistical tools. Aspirin was used as a model drug. Disintegrating matrix pellets of aspirin were developed using propylene glycol as a non-aqueous granulation liquid and croscarmellose as a disintegrant. A Plackett-Burman design was initially conducted to screen and identify the significant factors. Final optimization of the formula was performed by response surface methodology using a central composite design. The critical attributes of the pellet dosage forms (the dependent variables): disintegration time, sphericity and yield, were predicted with adequate accuracy based on the regression model. Pareto charts and contour charts were studied to understand the influence of the factors and predict the responses. A design space was constructed to meet the desirable targets for the responses in terms of disintegration time, sphericity >0.95 and friability <1.7%. The optimized matrix pellets were enteric coated using Eudragit L 100. The drug release from the enteric-coated pellets after 30 min in basic media was ~93%, compared to ~77% from the marketed pellets. The delayed-release pellets stored at 25°C/60% RH were stable for a period of 10 months. In conclusion, the developed process for disintegrating pellets using non-aqueous granulating agents can be used as an alternative technique for various water-sensitive drugs, circumventing the use of volatile organic solvents in conventional drug layering on inert cores. The scope of this study can be further extended to hydrophobic drugs, which may benefit from the rapid disintegration property and from the various hydrophilic excipients used in the optimized pellet formulation to enhance dissolution and in turn improve bioavailability.

11. Method for Detecting a Random Process in a Convex Hull Volume

Science.gov (United States)

2011-10-04

1), pp. 66-87 (March 1943). [0014] Wald, A. and J. Wolfowitz. "On a test whether two samples are from the same population." The Annals of... Method A (Wald-Wolfowitz Independent Sample Runs Test Procedure) [0052] An initial statistical test on input distributions is performed to evaluate... two-valued data sequence and is well known to those skilled in the art [Wald, A. and J. Wolfowitz. "On a test whether two samples are from the
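"Method A", the Wald-Wolfowitz runs test referenced in these fragments, checks whether a two-valued sequence is random by comparing the observed number of runs with its expectation under the null hypothesis. A minimal large-sample sketch:

```python
import math

def runs_test(seq):
    """Wald-Wolfowitz runs test on a two-valued sequence.

    Returns (observed runs, expected runs, z-statistic) under the
    null hypothesis that the sequence ordering is random.
    """
    symbols = sorted(set(seq))
    assert len(symbols) == 2, "sequence must be two-valued"
    n1 = sum(1 for x in seq if x == symbols[0])
    n2 = len(seq) - n1
    # A run is a maximal block of identical consecutive symbols.
    runs = 1 + sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    n = n1 + n2
    mu = 2 * n1 * n2 / n + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
    z = (runs - mu) / math.sqrt(var)
    return runs, mu, z

# A strongly alternating sequence has many runs, giving a large positive z.
runs, mu, z = runs_test("ABABABABAB")
print(runs, round(mu, 2), round(z, 2))
```

Large |z| (against the standard normal) rejects randomness; too few runs suggest clustering, too many suggest alternation.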

12. Analysis of the permitting processes associated with exploration of Federal OCS leases. Final report. Volume II. Appendices

Energy Technology Data Exchange (ETDEWEB)

1980-11-01

Under contract to the Office of Leasing Policy Development (LPDO), Jack Faucett Associates is currently undertaking the description and analysis of the Outer Continental Shelf (OCS) regulatory process to determine the nature of time delays that affect OCS production of oil and gas. This report represents the results of the first phase of research under this contract, the description and analysis of regulatory activity associated with exploration activities on the Federal OCS. Volume 1 contains the following three sections: (1) study results; (2) Federal regulatory activities during exploration of Federal OCS leases, which involved the US Geological Survey, Environmental Protection Agency, US Coast Guard, Corps of Engineers, and National Ocean and Atmospheric Administration; and (3) state regulatory activities during exploration of Federal OCS leases of Alaska, California, Louisiana, Massachusetts, New Jersey, North Carolina and Texas. Volume II contains appendices on the US Geological Survey, Environmental Protection Agency, Coast Guard, Corps of Engineers, the Coastal Zone Management Act, and Alaska. The major causes of delay in the regulatory process governing exploration were summarized in four broad categories: (1) the long and tedious process associated with the Environmental Protection Agency's implementation of the National Pollutant Discharge Elimination System Permit; (2) the lack of mandated time periods for the completion of individual activities in the permitting process; (3) the lack of overall coordination of OCS exploratory regulation; and (4) the inexperience of states, the Federal government and industry relating to the appropriate level of regulation for first-time lease sale areas.

13. Experimental statistics

CERN Document Server

Natrella, Mary Gibbons

2005-01-01

Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
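Among the techniques the handbook covers is the analysis of extreme-value data. A common approach is fitting a Gumbel (extreme-value type I) distribution; the sketch below uses a simple method-of-moments estimator with illustrative data, not the handbook's own tabulated procedure:

```python
import math

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_moments_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution to sample maxima:
    beta = s * sqrt(6) / pi, mu = mean - gamma * beta.
    """
    n = len(maxima)
    mean = sum(maxima) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in maxima) / (n - 1))
    beta = s * math.sqrt(6) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def gumbel_cdf(x, mu, beta):
    """P(X <= x) for the fitted Gumbel distribution."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Illustrative annual-maximum observations (not from the handbook).
maxima = [52, 48, 60, 55, 50, 58, 47, 62, 53, 57]
mu, beta = gumbel_moments_fit(maxima)
print(round(mu, 2), round(beta, 2))
```

Exceedance probabilities and return levels then come directly from `gumbel_cdf` and its inverse.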

14. Extraterrestrial processing and manufacturing of large space systems. Volume 3: Executive summary

Science.gov (United States)

Miller, R. H.; Smith, D. B. S.

1979-01-01

Facilities and equipment are defined for refining, to commercial grade, lunar material that is delivered to a 'space manufacturing facility' in beneficiated, primary-processed quality. The manufacturing facilities and the equipment for producing elements of large space systems from these materials are also defined, along with programmatic assessments of the concepts. In-space production processes are described for solar cells (by vapor deposition) and arrays, structures and joints, conduits, waveguides, RF equipment, radiators, wire cables, converters, and others.

15. HTGR high temperature process heat design and cost status report. Volume II. Appendices

Energy Technology Data Exchange (ETDEWEB)

None

1981-12-01

Information is presented concerning the 850°C IDC reactor vessel; primary cooling system; secondary helium system; steam generator; heat cycle evaluations for the 850°C IDC plant; 950°C DC reactor vessel; 950°C DC steam generator; direct and indirect cycle reformers; methanation plant; thermochemical pipeline; methodology for screening candidate synfuel processes; ECCG process; project technical requirements; process gas explosion assessment; HTGR program economic guidelines; and vendor responses.

16. Injury Statistics

Science.gov (United States)

17. Simulation of cooling channel rheocasting process of A356 aluminum alloy using three-phase volume averaging model

Institute of Scientific and Technical Information of China (English)

T. Wang; B. Pustal; M. Abondano; T. Grimmig; A. Bührig-Polaczek; M. Wu; A. Ludwig

2005-01-01

The cooling channel process is a rheocasting method by which prematerial with a globular microstructure can be produced to feed the thixocasting process. A three-phase model based on a volume averaging approach is proposed to simulate the cooling channel process for A356 aluminum alloy. The three phases are liquid, solid and air, treated as separate and interacting continua sharing a single pressure field. The mass, momentum and enthalpy transport equations are solved for each phase. The developed model can predict the evolution of the liquid, solid and air fractions as well as the distribution of grain density and grain size. The effect of pouring temperature on grain density, grain size and solid fraction is analyzed in detail.
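Volume-averaged solidification models of this kind track how the solid fraction grows as the melt cools through the channel. As a rough standalone sketch — not the paper's three-phase closure — the classical Scheil approximation with illustrative Al-Si parameters gives the solid fraction as a function of temperature:

```python
def scheil_solid_fraction(T, T_m=660.0, T_liq=615.0, k=0.13):
    """Scheil approximation of solid fraction vs. temperature (degC) for a
    binary alloy. Melting point, liquidus and partition coefficient are
    illustrative values for an Al-Si alloy such as A356, not the paper's.
    """
    if T >= T_liq:
        return 0.0  # fully liquid above the liquidus
    # Scheil equation: f_s = 1 - ((T_m - T) / (T_m - T_liq))**(1/(k-1))
    fs = 1.0 - ((T_m - T) / (T_m - T_liq)) ** (1.0 / (k - 1.0))
    return min(fs, 1.0)

for T in (615.0, 600.0, 580.0, 550.0):
    print(T, round(scheil_solid_fraction(T), 3))
```

The full model replaces this closed-form curve with transport equations coupling grain density, latent heat release and interphase drag, but the monotone growth of solid fraction on cooling is the same qualitative behaviour.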

18. Cosmic Statistics of Statistics

OpenAIRE

Szapudi, I.; Colombi, S.; Bernardeau, F.

1999-01-01

The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
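The factorial moments whose errors Szapudi & Colombi analyse are simple to estimate from counts-in-cells; a minimal sketch of the sample estimator:

```python
def factorial_moments(counts, kmax=3):
    """Sample factorial moments F_k = <N(N-1)...(N-k+1)> of counts-in-cells.

    `counts` is a list of object counts, one per cell; returns [F_1..F_kmax].
    For a Poisson process, E[F_k] = mean**k, so deviations from that
    signal clustering.
    """
    n_cells = len(counts)
    F = []
    for k in range(1, kmax + 1):
        total = 0
        for N in counts:
            # Falling factorial N (N-1) ... (N-k+1).
            term = 1
            for j in range(k):
                term *= (N - j)
            total += term
        F.append(total / n_cells)
    return F

# Illustrative cell counts (not from a real catalog).
print(factorial_moments([3, 1, 4, 1, 5, 9, 2, 6], kmax=2))
```

Cumulants (connected moments) then follow from the factorial moments by the series expansion the abstract refers to; the error theory quantifies the scatter of these estimators across finite catalogs.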

19. Process system evaluation: Consolidated letter reports. Volume 3: Formulation of final products

Energy Technology Data Exchange (ETDEWEB)

Josephson, G.B.; Chapman, C.C.; Albertsen, K.H.

1996-04-01

Glass discharged from the low-level waste (LLW) melter may be processed into a variety of different forms for storage and disposal. The purpose of the study reported here is to identify and evaluate processing options for forming the glass.

20. Space vehicle electrical power processing distribution and control study. Volume 1: Summary

Science.gov (United States)

Krausz, A.

1972-01-01

A concept for the processing, distribution, and control of electric power for manned space vehicles and future aircraft is presented. Emphasis is placed on the requirements of the space station and space shuttle configurations. The systems involved are referred to as the processing distribution and control system (PDCS), electrical power system (EPS), and electric power generation system (EPGS).