Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.
* Assumes no previous training in statistics
* Explains how and why modern statistical methods provide more accurate results than conventional methods
* Covers the latest developments on multiple comparisons
* Includes recent advanc
Optimization techniques in statistics
Rustagi, Jagdish S
1994-01-01
Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
Statistical considerations on safety analysis
International Nuclear Information System (INIS)
Pal, L.; Makai, M.
2004-01-01
The authors have investigated the statistical methods applied to the safety analysis of nuclear reactors and arrived at alarming conclusions: a series of calculations with the generally well-regarded safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, they came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another previously unnoticed conclusion of the earlier ATHLET calculations is that certain innocent-looking parameters (such as the wall roughness factor, the number of bubbles per unit volume, and the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned about the statistical foundation of present-day safety analysis practices and can only hope that their concerns will prove to be a misjudgment. Until then, they suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal-hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code, one carries out a series of computations. The input data are not precisely determined, because measured data have errors and calculated data are often obtained from a more or less accurate model. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability is known with which the
Statistical considerations in NRDA studies
International Nuclear Information System (INIS)
Harner, E.G.; Parker, K.R.; Skalski, J.R.
1993-01-01
Biological, chemical, and toxicological variables are usually modeled with lognormal, Poisson, negative binomial, or binomial error distributions. Species counts and densities often have frequent zeros and overdispersion. Chemical concentrations can have frequent non-detects and a small proportion of high values. The feasibility of making adjustments to these response variables, such as zero-inflated models, is discussed. Toxicity measurements are usually modeled with the binomial distribution. A strategy for determining the most appropriate distribution is presented. Model-based methods, using concomitant variables and interactions, enhance assessment of impacts. Concomitant variable models reduce variability and also reduce bias by adjusting means to a common basis. Variable selection strategies are given for determining the most appropriate set of concomitant variables. Multi-year generalized linear models test impact-by-time interactions, possibly after adjusting for time-dependent concomitant variables. Communities are analyzed to make inferences about overall biological impact and recovery and require non-normal multivariate techniques. Partial canonical correspondence analysis is an appropriate community model for ordinating spatial and temporal shifts due to impact. The Exxon Valdez spill is used as a case study.
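As a concrete illustration of the screening the abstract describes, a minimal sketch that checks count data for overdispersion and excess zeros; the data and interpretation thresholds here are hypothetical, not from the study:

```python
# Hypothetical species counts with frequent zeros and one large value, as
# the abstract describes for NRDA response variables.
counts = [0, 0, 0, 1, 0, 2, 0, 0, 9, 0, 3, 0]

def dispersion_index(xs):
    """Variance-to-mean ratio: ~1 for Poisson data, >1 signals overdispersion
    (suggesting a negative binomial or zero-inflated model instead)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return var / mean

def zero_fraction(xs):
    """Share of exact zeros; a large value hints at zero inflation."""
    return sum(1 for x in xs if x == 0) / len(xs)

di = dispersion_index(counts)   # well above 1 for these counts
zf = zero_fraction(counts)      # two thirds of the observations are zero
```

Counts failing both checks would, following the strategy the abstract outlines, be steered away from a plain Poisson model.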
Statistical Techniques for Project Control
Badiru, Adedeji B
2012-01-01
A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management and then explores how to temper quantitative analysis with qualitati
The maximum entropy technique. System's statistical description
International Nuclear Information System (INIS)
Belashev, B.Z.; Sulejmanov, M.K.
2002-01-01
The maximum entropy technique (MENT) is applied to the search for the distribution functions of physical quantities. MENT naturally incorporates the maximum-entropy requirement, the characteristics of the system, and the connection conditions. It can be applied to the statistical description of both closed and open systems. Examples are considered in which MENT was used to describe equilibrium and nonequilibrium states, including states far from thermodynamic equilibrium.
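As a sketch of how a maximum-entropy distribution is fixed by constraints, consider the classic dice example: among all distributions on the faces 1..6 with a prescribed mean, the maximum-entropy one has the exponential form p_i ∝ exp(-λi), with λ set by the mean constraint. The bisection solver below is an illustrative implementation, not taken from the paper:

```python
import math

def maxent_die(target_mean, lo=-10.0, hi=10.0, iters=200):
    """Maximum-entropy distribution on faces 1..6 with a fixed mean:
    p_i proportional to exp(-lam * i), with lam found by bisection."""
    faces = range(1, 7)

    def mean_for(lam):
        w = [math.exp(-lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        # mean_for is decreasing in lam: if the mean is still too high,
        # search larger lam; otherwise search smaller lam.
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

# A loaded die with mean 4.5: probabilities tilt towards the high faces.
p = maxent_die(4.5)
```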
Probabilistic cellular automata: Some statistical mechanical considerations
International Nuclear Information System (INIS)
Lebowitz, J.L.; Maes, C.; Speer, E.R.
1990-01-01
Spin systems evolving in continuous or discrete time under the action of stochastic dynamics are used to model phenomena as diverse as the structure of alloys and the functioning of neural networks. While in some cases the dynamics are secondary, designed to produce a specific stationary measure whose properties one is interested in studying, there are other cases in which the only available information is the dynamical rule. Prime examples of the former are computer simulations, via Glauber dynamics, of equilibrium Gibbs measures with a specified interaction potential. Examples of the latter include various types of majority rule dynamics used as models for pattern recognition and for error-tolerant computations. The present note discusses ways in which techniques found useful in equilibrium statistical mechanics can be applied to a particular class of models of the latter types. These are cellular automata with noise: systems in which the spins are updated stochastically at integer times, simultaneously at all sites of some regular lattice. These models were first investigated in detail in the Soviet literature of the late sixties and early seventies. They are now generally referred to as Stochastic or Probabilistic Cellular Automata (PCA), and may be considered to include deterministic automata (CA) as special limits. 16 refs., 3 figs
21 CFR 820.250 - Statistical techniques.
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Statistical considerations in design of spacelab experiments
Robinson, J.
1978-01-01
After making an analysis of experimental error sources, statistical models were developed for the design and analysis of potential Space Shuttle experiments. Guidelines for statistical significance and/or confidence limits of expected results were also included. The models were then tested on the following proposed Space Shuttle biomedical experiments: (1) bone density by computer tomography; (2) basal metabolism; and (3) total body water. Analysis of those results, and therefore of the models, proved inconclusive due to the lack of previous research data and statistical values. However, the models were seen as possible guides for making some predictions and decisions.
Difficult cases for chromosomal dosimetry: Statistical considerations
Energy Technology Data Exchange (ETDEWEB)
Vinnikov, Volodymyr A., E-mail: vlad.vinnikov@mail.ru [Grigoriev Institute for Medical Radiology of the National Academy of Medical Science of Ukraine, Pushkinskaya Street 82, Kharkiv 61024 (Ukraine); Ainsbury, Elizabeth A., E-mail: liz.ainsbury@hpa.org.uk [Health Protection Agency, Centre for Radiation, Chemical and Environmental Hazards, Chilton, Didcot, Oxon OX11 0RQ (United Kingdom); Lloyd, David C., E-mail: david.lloyd@hpa.org.uk [Health Protection Agency, Centre for Radiation, Chemical and Environmental Hazards, Chilton, Didcot, Oxon OX11 0RQ (United Kingdom); Maznyk, Nataliya A., E-mail: maznik.cytogen@mail.ru [Grigoriev Institute for Medical Radiology of the National Academy of Medical Science of Ukraine, Pushkinskaya Street 82, Kharkiv 61024 (Ukraine); Rothkamm, Kai, E-mail: kai.rothkamm@hpa.org.uk [Health Protection Agency, Centre for Radiation, Chemical and Environmental Hazards, Chilton, Didcot, Oxon OX11 0RQ (United Kingdom)
2011-09-15
Several examples are selected from the literature in order to illustrate combinations of complicating factors, which may occur in real-life radiation exposure scenarios that affect the accuracy of cytogenetic dose estimates. An analysis of limitations in the current statistical methods used in biodosimetry was carried out. Possible directions for further improvement of the statistical basis of chromosomal dosimetry by specific mathematical procedures are outlined.
Projection operator techniques in nonequilibrium statistical mechanics
International Nuclear Information System (INIS)
Grabert, H.
1982-01-01
This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics, the Fokker-Planck and master equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)
Statistically tuned Gaussian background subtraction technique for ...
Indian Academy of Sciences (India)
ground, small objects, moving background and multiple objects are considered for evaluation. The technique is statistically compared with the frame differencing technique, the temporal median method and the mixture of Gaussians model, and performance evaluation is done to check the effectiveness of the proposed technique after ...
Statistical and Computational Techniques in Manufacturing
2012-01-01
In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have achieved several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...
Scientific, statistical, practical, and regulatory considerations in design space development.
Debevec, Veronika; Srčič, Stanko; Horvat, Matej
2018-03-01
The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.
Review of the Statistical Techniques in Medical Sciences | Okeh ...
African Journals Online (AJOL)
... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp.
Fatigue crack initiation and growth life prediction with statistical consideration
International Nuclear Information System (INIS)
Kwon, J.D.; Choi, S.H.; Kwak, S.G.; Chun, K.O.
1991-01-01
Life prediction, and residual life prediction, of structures and machines is a problem in strong worldwide demand, a requirement of the slowly developing economic stage that follows a period of rapid growth. For the purpose of statistical life prediction, fatigue tests were conducted at 3 stress levels, with 20 specimens used at each level. From the experimental results one can obtain the statistical properties of the crack-growth parameters m and C in the fatigue crack growth law da/dN = C(ΔK)^m, the relationship between m and C, and the statistical distribution patterns of the fatigue crack initiation, growth, and fracture lives.
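The crack-growth law quoted above can be integrated numerically to estimate life: N = ∫ da / (C·ΔK(a)^m). The sketch below assumes a centre-crack geometry, ΔK = Δσ·√(πa), and illustrative values of C, m and Δσ; none of these numbers come from the paper's experiments:

```python
import math

# Illustrative Paris-law parameters and geometry (assumed, not the paper's).
C = 1.0e-11      # crack-growth coefficient
m = 3.0          # crack-growth exponent
d_sigma = 100.0  # stress range

def delta_K(a):
    """Stress-intensity range for a centre crack: dK = d_sigma * sqrt(pi*a)."""
    return d_sigma * math.sqrt(math.pi * a)

def cycles_to_failure(a0, a_crit, steps=20000):
    """Integrate dN = da / (C * dK(a)**m) with the midpoint rule, from the
    initial crack length a0 to the critical length a_crit."""
    da = (a_crit - a0) / steps
    N = 0.0
    for i in range(steps):
        a = a0 + (i + 0.5) * da  # midpoint of the i-th slice
        N += da / (C * delta_K(a) ** m)
    return N

N = cycles_to_failure(0.001, 0.01)  # cycles from a 1 mm to a 10 mm crack
```

Because da/dN grows with a, most of the life is spent while the crack is still short, which is why a larger initial flaw sharply reduces the predicted life.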
Statistically tuned Gaussian background subtraction technique for ...
Indian Academy of Sciences (India)
Keywords. Tuning factor; background segmentation; unmanned aerial vehicle; aerial surveillance; thresholding. Abstract. Background subtraction is one of the efficient techniques to segment the targets from non-informative background of a video. The traditional background subtraction technique suits for videos with static ...
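A minimal per-pixel sketch of such a statistically tuned Gaussian background model, with the detection threshold scaled by a tuning factor k. The frames here are 1-D toy "images", and the parameter names and values (k, rho) are assumptions for illustration, not the authors' settings:

```python
class GaussianBackground:
    """Running Gaussian background model: each pixel keeps a mean and a
    variance; a pixel is foreground when it deviates by more than k sigma."""

    def __init__(self, first_frame, k=2.5, rho=0.05, var0=25.0):
        self.mu = [float(p) for p in first_frame]
        self.var = [var0] * len(first_frame)
        self.k, self.rho = k, rho

    def apply(self, frame):
        """Return a foreground mask (1 = target) and update the model."""
        mask = []
        for i, x in enumerate(frame):
            d = x - self.mu[i]
            if d * d > (self.k ** 2) * self.var[i]:
                mask.append(1)   # foreground: leave the model frozen
            else:
                mask.append(0)   # background: adapt slowly towards the pixel
                self.mu[i] += self.rho * d
                self.var[i] = (1 - self.rho) * self.var[i] + self.rho * d * d
        return mask

bg = GaussianBackground([10, 10, 10, 10])
mask = bg.apply([10, 11, 90, 10])   # one pixel jumps: a moving target
```

Raising k suppresses noise at the cost of missing faint targets, which is the trade-off the tuning factor controls.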
Developing Communication Skills: General Considerations and Specific Techniques.
Joiner, Elizabeth Garner; Westphal, Patricia Barney, Ed.
This practical book is designed for the classroom teacher of a second or foreign language at any level. The articles are grouped into two distinct but interdependent sections on general considerations and specific techniques. The contents of the first section are as follows: "Moi Tarzan, Vous Jane?: A Study of Communicative Competence" by P.B.…
Statistical Techniques in Electrical and Computer Engineering
Indian Academy of Sciences (India)
Stochastic models and statistical inference from them have been popular methodologies in a variety of engineering disciplines, notably in electrical and computer engineering. Recent years have seen explosive growth in this area, driven by technological imperatives. These now go well beyond their traditional domain of ...
Testing of statistical techniques used in SYVAC
International Nuclear Information System (INIS)
Dalrymple, G.; Edwards, H.; Prust, J.
1984-01-01
Analysis of the SYVAC (SYstems Variability Analysis Code) output adopted four techniques to provide a cross comparison of their performance. The techniques used were: examination of scatter plots; correlation/regression; Kruskal-Wallis one-way analysis of variance by ranks; comparison of cumulative distribution functions and risk estimates between sub-ranges of parameter values. The analysis was conducted for the case of a single nuclide chain and was based mainly on simulated dose after 500,000 years. The results from this single SYVAC case showed that site parameters had the greatest influence on dose to man. The techniques of correlation/regression and Kruskal-Wallis were both successful and consistent in their identification of important parameters. Both techniques ranked the eight most important parameters in the same order when analysed for maximum dose. The results from a comparison of cdfs and risks in sub-ranges of the parameter values were not entirely consistent with other techniques. Further sampling of the high dose region is recommended in order to improve the accuracy of this method. (author)
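The rank-based screening used in such analyses (correlation alongside Kruskal-Wallis) can be sketched with a Spearman rank correlation between sampled input parameters and the simulated dose; the toy dose model and data below are illustrative, not SYVAC output:

```python
import random

def ranks(xs):
    """Rank positions of xs (no tie handling; fine for continuous samples)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

random.seed(1)
site = [random.random() for _ in range(200)]    # influential input
minor = [random.random() for _ in range(200)]   # weak input
dose = [s ** 3 + 0.01 * m for s, m in zip(site, minor)]  # toy dose model

rho_site = spearman(site, dose)    # near 1: ranks as important
rho_minor = spearman(minor, dose)  # near 0: ranks as unimportant
```

Because ranks are invariant under monotone transforms, the nonlinear s³ dependence does not hide the site parameter's influence, which is the appeal of rank methods for codes like SYVAC.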
Variation in reaction norms : Statistical considerations and biological interpretation
Morrissey, Michael B.; Liefting, Maartje
2016-01-01
Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what
Time series prediction: statistical and neural techniques
Zahirniak, Daniel R.; DeSimio, Martin P.
1996-03-01
In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near-linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
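Of the linear baselines mentioned, the Widrow-Hoff (LMS) adaptive filter is the simplest to sketch. The one-step-ahead predictor below, with an assumed tap count and step size, adapts its weights from the prediction error and converges quickly on a stationary sinusoid:

```python
import math

def lms_predict(series, taps=4, mu=0.05):
    """Predict series[n] from the previous `taps` samples, adapting the
    weights by the Widrow-Hoff (LMS) rule: w <- w + mu * error * input."""
    w = [0.0] * taps
    errors = []
    for n in range(taps, len(series)):
        x = series[n - taps:n]                       # most recent inputs
        y_hat = sum(wi * xi for wi, xi in zip(w, x)) # linear prediction
        e = series[n] - y_hat
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
        errors.append(e * e)
    return w, errors

# A stationary test signal: prediction error should shrink as LMS adapts.
signal = [math.sin(0.3 * n) for n in range(500)]
w, errors = lms_predict(signal)
```

On a nonlinear or non-stationary series the same filter would plateau at a much higher error, which is the regime where the paper finds the neural networks worthwhile.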
FINE-GRAINED CELLULAR CONCRETE CREEP ANALYSIS TECHNIQUE WITH CONSIDERATION FOR CARBONATION
Directory of Open Access Journals (Sweden)
M. A. Gaziev
2015-01-01
The article considers the creep and creep deformation analysis technique in fine-grained cellular concrete with consideration for carbonation and assurance requirements for the repairing properties and seismic stability. The procedure for determining the creep of fine-grained cellular concrete is proposed with account of its carbonation by atmospheric carbon dioxide. It has been found theoretically and experimentally that the proposed technique allows obtaining reproducible results and can be recommended for creep determination of fine-grained cellular concretes, including repairing ones, taking into account their carbonation.
Predicting radiotherapy outcomes using statistical learning techniques
Energy Technology Data Exchange (ETDEWEB)
El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O [Washington University, Saint Louis, MO (United States); Lindsay, Patricia E; Hope, Andrew J [Department of Radiation Oncology, Princess Margaret Hospital, Toronto, ON (Canada)
2009-09-21
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among
Predicting radiotherapy outcomes using statistical learning techniques
El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.
2009-09-01
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
Predicting radiotherapy outcomes using statistical learning techniques
International Nuclear Information System (INIS)
El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O; Lindsay, Patricia E; Hope, Andrew J
2009-01-01
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
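The linear and nonlinear kernels evaluated in studies of this kind can be sketched directly: below, a linear and a Gaussian (RBF) kernel and a Gram matrix over toy patient feature vectors. The features and the γ value are illustrative assumptions, not the study's data:

```python
import math

def linear_kernel(u, v):
    """Linear kernel: a plain dot product of the feature vectors."""
    return sum(a * b for a, b in zip(u, v))

def rbf_kernel(u, v, gamma=0.5):
    """Gaussian (RBF) kernel: exp(-gamma * ||u - v||^2), able to capture
    nonlinear interactions between prognostic variables."""
    d2 = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-gamma * d2)

# Gram (kernel) matrix over a toy set of feature vectors; an SVM trains
# entirely on this matrix rather than on the raw features.
X = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
K = [[rbf_kernel(u, v) for v in X] for u in X]
```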
"Statistical Techniques for Particle Physics" (4/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
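The Neyman-Pearson lemma mentioned in the outline states that the likelihood ratio is the optimal test statistic for a simple signal-versus-background hypothesis test. A minimal sketch, with assumed unit-width Gaussian models for the two hypotheses:

```python
import math

def gauss(x, mu, sigma=1.0):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x, mu_sig=1.0, mu_bkg=0.0):
    """lambda(x) = L(x | signal) / L(x | background): by the Neyman-Pearson
    lemma, cutting on this ratio is the most powerful test at fixed size."""
    return gauss(x, mu_sig) / gauss(x, mu_bkg)

# Classify as "signal" when the ratio exceeds a cut chosen for the desired
# false-positive (background) rate; cut = 1 corresponds to x > 0.5 here.
cut = 1.0
decisions = [likelihood_ratio(x) > cut for x in (-1.0, 0.2, 1.5)]
```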
"Statistical Techniques for Particle Physics" (1/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (2/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (3/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
The statistical chopper in the time-of-flight technique
International Nuclear Information System (INIS)
Albuquerque Vieira, J. de.
1975-12-01
A detailed study of the 'statistical' chopper and of the method of analysis of the data obtained by this technique is made. The study includes the basic ideas behind correlation methods applied in time-of-flight techniques; comparisons with the conventional chopper made by an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; the presentation of the related fields of work to demonstrate the potential of this method and suggestions for future study together with the criteria for a time-of-flight experiment using the method being studied [pt
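The correlation idea behind the statistical chopper can be sketched as follows: the detector signal is the spectrum folded with a pseudo-random (maximal-length) chopper sequence, and cross-correlating with the mean-subtracted sequence recovers the spectrum. The LFSR used to generate the sequence and the toy single-line spectrum are illustrative assumptions:

```python
def m_sequence(n_bits=5, taps=(5, 2)):
    """0/1 maximal-length sequence from a linear-feedback shift register
    (period 2**n_bits - 1; two-level autocorrelation)."""
    state = [1] * n_bits
    seq = []
    for _ in range(2 ** n_bits - 1):
        seq.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return seq

def measure(spectrum, chop):
    """Detector counts: circular folding of the spectrum with the chopper."""
    N = len(chop)
    return [sum(chop[(k - j) % N] * spectrum[j] for j in range(N))
            for k in range(N)]

def recover(counts, chop):
    """Cross-correlate the counts with the mean-subtracted chopper sequence;
    the sequence's flat off-peak autocorrelation makes the spectrum emerge."""
    N = len(chop)
    cbar = sum(chop) / N
    return [sum((chop[(k - j) % N] - cbar) * counts[k] for k in range(N))
            for j in range(N)]

chop = m_sequence()
spectrum = [0.0] * len(chop)
spectrum[10] = 5.0                          # one sharp time-of-flight line
est = recover(measure(spectrum, chop), chop)
peak = max(range(len(est)), key=lambda j: est[j])
```

Because roughly half the beam is transmitted on average, this technique trades the duty-cycle loss of a conventional chopper for a statistical unfolding step, which is the efficiency argument the correlation method rests on.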
Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR
2014-07-12
Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR: report front matter (Columbia, MO), with the standard disclaimer that the views, opinions and/or findings contained in the report are those of the author(s). Cited related work includes a fuzzy aggregation operator for humanitarian demining using hand-held GPR (D. Ho, P. Gader, J. Wilson, H. Frigui, 2008) and subspace processing.
GIS-based bivariate statistical techniques for groundwater potential ...
Indian Academy of Sciences (India)
Ali Haghizadeh
2017-11-23
... of both systemic and stochastic uncertainty. Finally, it can be realized that these techniques are very beneficial for analyzing groundwater potential and can be practical for water-resource management experts. Keywords: groundwater; statistical index; Dempster–Shafer theory; water resource management.
GIS-based bivariate statistical techniques for groundwater potential ...
Indian Academy of Sciences (India)
Groundwater potential analysis prepares better comprehension of hydrological settings of different regions. This study shows the potency of two GIS-based data driven bivariate techniques namely statistical index (SI) and Dempster–Shafer theory (DST) to analyze groundwater potential in Broujerd region of Iran.
1980 Summer Study on Statistical Techniques in Army Testing.
1980-07-01
Army Science Board, Washington, D.C. 20310, 1980 Summer Study on Statistical Techniques in Army Testing, July 1980. ...statisticians is adequate, and in some cases, excellent. In the areas of education and the dissemination of information, the Study Group found that the
Techniques in teaching statistics : linking research production and research use.
Energy Technology Data Exchange (ETDEWEB)
Martinez-Moyano, I .; Smith, A. (Decision and Information Sciences); (Univ. of Massachusetts at Boston)
2012-01-01
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Statistical Theory of the Vector Random Decrement Technique
DEFF Research Database (Denmark)
Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.
1999-01-01
...decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic processes describing the ambient measurements. The Vector Random Decrement functions are linked to the correlation functions of the stochastic processes provided they are stationary and Gaussian distributed. Furthermore, a new approach for quality assessment of the Vector Random Decrement functions is given on the basis of the derived results. The work presented in this paper makes the theory of the Vector Random Decrement technique equivalent to the theory of the Random Decrement technique. The theoretical derivations are illustrated by the analysis of the response of a 3DOF system loaded by white noise.
Lightweight and Statistical Techniques for Petascale Debugging
Energy Technology Data Exchange (ETDEWEB)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques are used for the forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are observed to be in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. But such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such an approach, and they are an inefficient approach to forecasting. With the advancement in technology and research, an alternative to the above traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting of air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are observed to be in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
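The coupling of PCA with regression described above can be sketched as principal-component regression: project standardized predictors onto their leading components, then fit ordinary least squares on the scores. All data, sizes, and the choice of two retained components below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
f = rng.normal(size=(n, 1))                          # one dominant meteorological factor
loadings = np.array([[1.0, 0.8, 0.6, 0.4, 0.2]])
X = f @ loadings + 0.3 * rng.normal(size=(n, 5))     # 5 correlated predictors
y = 2.0 * f[:, 0] + 0.2 * rng.normal(size=n)         # pollutant concentration (synthetic)

# Standardize predictors, then PCA via SVD
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
Z = Xs @ Vt[:2].T                                    # scores on the two leading PCs

# Ordinary least squares on the retained components
A = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
R = np.corrcoef(pred, y)[0, 1]                       # correlation coefficient of the forecast
```

Because the predictors share one dominant factor, the first component captures most of the signal and the forecast correlates strongly with the observations.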
Categorical and nonparametric data analysis choosing the best statistical technique
Nussbaum, E Michael
2014-01-01
Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
The application of statistical techniques to nuclear materials accountancy
International Nuclear Information System (INIS)
Annibal, P.S.; Roberts, P.D.
1990-02-01
Over the past decade much theoretical research has been carried out on the development of statistical methods for nuclear materials accountancy. In practice, plant operation may differ substantially from the idealized models often cited. This paper demonstrates the importance of taking account of plant operation in applying the statistical techniques, to improve the accuracy of the estimates and the knowledge of the errors. The benefits are quantified either by theoretical calculation or by simulation. Two different aspects are considered. First, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an accountancy tank is investigated. Second, a means of improving the knowledge of the 'Material Unaccounted For' (the difference between the inventory calculated from input/output data, and the measured inventory), using information about the plant measurement system, is developed and compared with existing general techniques. (author)
Green, John; Wheeler, James R
2013-11-15
Solvents are often used to aid test item preparation in aquatic ecotoxicity experiments. This paper discusses the practical, statistical and regulatory considerations. The selection of the appropriate control (if a solvent is used) for statistical analysis is investigated using a database of 141 responses (endpoints) from 71 experiments. The advantages and disadvantages of basing the statistical analysis of treatment effects on the water control alone, the solvent control alone, the combined controls, or a conditional strategy of combining controls when they are not statistically significantly different, are tested. The latter two approaches are shown to have distinct advantages. It is recommended that this approach continue to be the standard used for regulatory and research aquatic ecotoxicology studies. However, wherever technically feasible, a solvent should not be employed, or at least its concentration should be minimized.
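The conditional strategy described above can be sketched as: compare the water and solvent controls with a two-sample t statistic, and pool them only when they do not differ significantly. The data values and the critical threshold t_crit = 2.0 (standing in for the exact t quantile) are illustrative assumptions.

```python
import math
import statistics

water   = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]   # hypothetical endpoint values
solvent = [10.0, 10.2, 9.7, 10.1, 10.0, 9.8]

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * statistics.variance(a) + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

t_crit = 2.0                                    # illustrative critical value
if abs(pooled_t(water, solvent)) < t_crit:
    control = water + solvent                   # pool: controls not significantly different
else:
    control = solvent                           # otherwise compare treatments to the solvent control
```

Pooling, when justified, roughly doubles the control sample size and so increases the power of the treatment comparisons.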
Statistical Theory of the Vector Random Decrement Technique
ASMUSSEN, J. C.; BRINCKER, R.; IBRAHIM, S. R.
1999-09-01
The Vector Random Decrement technique has previously been introduced as an efficient method to transform ambient responses of linear structures into Vector Random Decrement functions which are equivalent to free decays of the current structure. The modal parameters can be extracted from the free decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic processes describing the ambient measurements. The Vector Random Decrement functions are linked to the correlation functions of the stochastic processes provided they are stationary and Gaussian distributed. Furthermore, a new approach for quality assessment of the Vector Random Decrement functions is given on the basis of the derived results. The work presented in this paper makes the theory of the Vector Random Decrement technique equivalent to the theory of the Random Decrement technique. The theoretical derivations are illustrated by the analysis of the response of a 3DOF system loaded by white noise.
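The link between Random Decrement functions and correlation functions stated above can be sketched in the scalar case: average response segments following a level-crossing trigger; for a stationary Gaussian process the averaged signature is proportional to the correlation function. The AR(1) surrogate process and all parameters below are illustrative assumptions.

```python
import random

random.seed(7)
phi, n, L = 0.9, 200000, 15
x = [0.0]
for _ in range(n - 1):                          # stationary Gaussian AR(1) "ambient response"
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

sigma = (sum(v * v for v in x) / n) ** 0.5      # process standard deviation
trigger = [i for i in range(n - L) if x[i] >= sigma]   # level-crossing trigger points

# Random Decrement signature: average of the segments following each trigger
rd = [sum(x[i + k] for i in trigger) / len(trigger) for k in range(L)]
ratio = rd[5] / rd[0]   # for a Gaussian process this tracks the correlation rho(5) = phi**5
```

The decay of the averaged signature matches the autocorrelation of the process, which is the equivalence the paper establishes in the vector setting.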
Bone marrow aspiration and biopsy. Technique and considerations
Directory of Open Access Journals (Sweden)
R.A. Trejo-Ayala
2015-10-01
Bone marrow aspiration and bone marrow biopsy are invasive procedures in which good technical skill is crucial to obtain samples suitable for processing and diagnostic interpretation. The type and calibre of the needle is one of the main variables of the technique, and is selected on the basis of the age, gender and body mass of the patient. This article provides a practical, step-by-step guide to the technique for both procedures. It also discusses existing techniques for reducing the pain associated with the procedure, an essential aspect for the patient which, if poorly handled, can force cancellation of the procedure.
Statistical techniques to extract information during SMAP soil moisture assimilation
Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.
2017-12-01
Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
Statistical Techniques for Assessing water‐quality effects of BMPs
Walker, John F.
1994-01-01
Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water‐quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm‐mass‐transport data as a means of improving the ability to detect BMP effects on stream‐water quality. Statistical techniques were applied to suspended‐sediment records from three rural watersheds in Illinois for the period 1981–84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm‐mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
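The regression approach described above can be sketched as: regress the water-quality response on a climate covariate (here, discharge), then test the residuals for a step change at BMP implementation. The synthetic data, effect size, and the 2.0 threshold on the t statistic are illustrative assumptions.

```python
import math
import random
import statistics

random.seed(3)
n_pre, n_post = 30, 30
q = [random.uniform(1.0, 10.0) for _ in range(n_pre + n_post)]   # discharge (climate driver)
load = [1.0 + 2.0 * qi + random.gauss(0.0, 0.5) for qi in q]     # suspended-sediment load
for i in range(n_pre, n_pre + n_post):
    load[i] -= 1.0                                               # hypothetical post-BMP reduction

# Regress load on discharge to strip climate-driven variability
qbar, lbar = statistics.mean(q), statistics.mean(load)
b1 = sum((qi - qbar) * (li - lbar) for qi, li in zip(q, load)) / \
     sum((qi - qbar) ** 2 for qi in q)
b0 = lbar - b1 * qbar
resid = [li - (b0 + b1 * qi) for qi, li in zip(q, load)]

# Two-sample t statistic on pre- vs post-BMP residuals
a, b = resid[:n_pre], resid[n_pre:]
sp2 = ((n_pre - 1) * statistics.variance(a) + (n_post - 1) * statistics.variance(b)) / (n_pre + n_post - 2)
t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / n_pre + 1 / n_post))
```

Removing the discharge-driven variance shrinks the residual scatter, which is why the regression step improves the ability to detect a BMP effect.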
Statistical techniques for noise removal from visual images
Allred, Lloyd G.; Kelly, Gary E.
1992-07-01
The median operator has been demonstrated to be a very effective method for restoring recognizable images from very noisy image data. The power of the median operator stems from its non-algebraic formulation, which prevents erroneous data corrupting the final color computation. A principal drawback is that the median operator replaces all data, erroneous or not, the result being a net loss of information. This paper presents alternative statistical outlier techniques by which erroneous data is readily recognized, but valid data usually remains unchanged. The result is an effective noise removal algorithm with reduced loss of information.
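The contrast drawn above can be sketched on a 1-D signal: a plain median filter replaces every sample, whereas an outlier rule replaces a sample only when it deviates from its neighborhood median by more than a threshold, leaving valid data unchanged. The signal and threshold are illustrative assumptions.

```python
def median3(a, b, c):
    """Median of three values."""
    return sorted((a, b, c))[1]

def median_filter(sig):
    """Replace every interior sample by its 3-point neighborhood median."""
    return [sig[0]] + [median3(*sig[i - 1:i + 2]) for i in range(1, len(sig) - 1)] + [sig[-1]]

def outlier_filter(sig, thresh=50):
    """Replace a sample only when it is an outlier relative to its neighborhood median."""
    out = list(sig)
    for i in range(1, len(sig) - 1):
        m = median3(*sig[i - 1:i + 2])
        if abs(sig[i] - m) > thresh:     # flag as erroneous, then repair
            out[i] = m
    return out

noisy = [10, 12, 11, 200, 13, 12, 14, 13]   # single impulse error at index 3
```

Here the outlier rule repairs the impulse at index 3 but leaves the valid sample at index 2 untouched, while the unconditional median filter alters it, illustrating the information loss the passage describes.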
Electromagnetic considerations for RF current density imaging [MRI technique].
Scott, G C; Joy, M G; Armstrong, R L; Henkelman, R M
1995-01-01
Radio frequency current density imaging (RF-CDI) is a recent MRI technique that can image a Larmor frequency current density component parallel to B(0). Because the feasibility of the technique was demonstrated only for homogeneous media, the authors' goal here is to clarify the electromagnetic assumptions and field theory to allow imaging RF currents in heterogeneous media. The complete RF field and current density imaging problem is posed. General solutions are given for measuring lab frame magnetic fields from the rotating frame magnetic field measurements. For the general case of elliptically polarized fields, in which current and magnetic field components are not in phase, one can obtain a modified single rotation approximation. Sufficient information exists to image the amplitude and phase of the RF current density parallel to B(0) if the partial derivative in the B(0) direction of the RF magnetic field (amplitude and phase) parallel to B(0) is much smaller than the corresponding current density component. The heterogeneous extension was verified by imaging conduction and displacement currents in a phantom containing saline and pure water compartments. Finally, the issues required to image eddy currents are presented. Eddy currents within a sample will distort both the transmitter coil reference system, and create measurable rotating frame magnetic fields. However, a three-dimensional electro-magnetic analysis will be required to determine how the reference system distortion affects computed eddy current images.
International Nuclear Information System (INIS)
Barthel, R.
2008-01-01
The German Radiation Protection Commission has recommended 'Principles and Methods for the Consideration of Statistical Uncertainties for the Determination of Representative Values of the Specific Activity of NORM wastes' concerning the proof of compliance with supervision limits or dose standards according to paragraph 97 and paragraph 98 of the Radiation Protection Ordinance, respectively. The recommendation comprises a method ensuring the representativeness of estimates for the specific activity of NORM wastes, which also assures the required evidence for conformity with respect to supervision limits or dose standards, respectively. On the basis of a sampling survey, confidence limits for expectation values of specific activities are determined, which will be used to show that the supervision limit or the dose standard is met or exceeded with certainty, or that the performed sampling is not sufficient for the intended assessment. The sampling effort depends on the type and the width of the distribution of specific activities and is determined by the position of the confidence interval with respect to the supervision limit or of the resulting doses with respect to the dose standard. The statistical uncertainties that are described by confidence limits may be reduced by an optimised extension of the sample number, as far as necessary. (orig.)
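The compliance logic described above can be sketched as: form confidence limits for the mean specific activity from a sampling survey and compare them with the supervision limit, with three possible outcomes. The sample values, the limit, and the one-sided normal quantile (a large-sample approximation) are illustrative assumptions.

```python
import math
import statistics

activity = [0.82, 0.91, 0.78, 0.88, 0.95, 0.80, 0.86, 0.90]   # Bq/g, hypothetical sample
limit = 1.0                                                   # hypothetical supervision limit
z = 1.645                                                     # one-sided 95% normal quantile

mean = statistics.mean(activity)
se = statistics.stdev(activity) / math.sqrt(len(activity))
upper, lower = mean + z * se, mean - z * se                   # confidence limits for the mean

if upper < limit:
    verdict = "compliant"                   # limit met with the required confidence
elif lower > limit:
    verdict = "exceeded"                    # limit exceeded with certainty
else:
    verdict = "sampling insufficient"       # extend the sample, as the recommendation foresees
```

The third branch captures the recommendation's point that statistical uncertainty can be reduced by an optimized extension of the sample number when the interval straddles the limit.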
Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques
Gulgundi, Mohammad Shahid; Shetty, Amba
2018-03-01
Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of the Bengaluru city using multivariate statistical techniques. Water quality index rating was calculated for pre and post monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show signs of poor quality in drinking purpose compared to pre-monsoon. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between two clusters and accounting for 89% spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which evolved three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock water interactions in the aquifer, effect of anthropogenic activities and ion exchange processes in water.
Techniques for the Statistical Analysis of Observer Data
National Research Council Canada - National Science Library
Bennett, John G
2001-01-01
.... The two techniques are as follows: (1) fitting logistic curves to the vehicle data, and (2) using the Fisher Exact Test to compare the probability of detection of the two vehicles at each range...
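Technique (2) above can be sketched with a one-sided Fisher Exact Test on detection counts for two vehicles at a single range, computed from the hypergeometric distribution. The counts are illustrative, not the study's observer data.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability of a result at least as extreme as 'a' detections under the null."""
    n, r1, c1 = a + b + c + d, a + b, a + c
    total = comb(n, c1)
    return sum(comb(r1, x) * comb(n - r1, c1 - x) for x in range(a, min(r1, c1) + 1)) / total

# Hypothetical counts at one range: vehicle 1 detected 9/10, vehicle 2 detected 2/10
p = fisher_one_sided(9, 1, 2, 8)
```

The exact test is appropriate here because the per-range detection counts are small, where a chi-squared approximation would be unreliable.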
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical
Velocity field statistics and tessellation techniques : Unbiased estimators of Omega
Van de Weygaert, R; Bernardeau, F; Muller; Gottlober, S; Mucket, JP; Wambsganss, J
1998-01-01
We describe two new - stochastic-geometrical - methods to obtain reliable velocity field statistics from N-body simulations and from any general density and velocity fluctuation field sampled at a discrete set of locations. These methods, the Voronoi tessellation method and Delaunay tessellation
Velocity Field Statistics and Tessellation Techniques : Unbiased Estimators of Omega
Weygaert, R. van de; Bernardeau, F.
1998-01-01
We describe two new, stochastic-geometrical, methods to obtain reliable velocity field statistics from N-body simulations and from any general density and velocity fluctuation field sampled at a discrete set of locations. These methods, the Voronoi tessellation method and Delaunay
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
Statistical sampling techniques as applied to OSE inspections
International Nuclear Information System (INIS)
Davis, J.J.; Cote, R.W.
1987-01-01
The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing
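A sample-size computation of the kind described above can be sketched as: the number of items to inspect so that an estimated proportion has a given margin of error at a given confidence level (normal approximation), with a finite population correction for small populations. The margin, confidence level, and population size are illustrative assumptions.

```python
import math

def sample_size(margin, z=1.96, p=0.5):
    """Items to sample for a proportion within +/- margin at confidence given by z.
    p = 0.5 is the conservative worst case."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

def sample_size_fpc(margin, N, z=1.96, p=0.5):
    """Finite population correction for a population of N items."""
    n = sample_size(margin, z, p)
    return math.ceil(n / (1 + (n - 1) / N))

n = sample_size(0.05)                 # +/- 5 percentage points at 95% confidence
n_small = sample_size_fpc(0.05, 1000) # same target, population of 1000 documents
```

The finite-population variant matters for tasks like sampling a fixed set of classified documents, where the uncorrected formula would overstate the required sample.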
Statistical techniques for the characterization of partially observed epidemics.
Energy Technology Data Exchange (ETDEWEB)
Safta, Cosmin; Ray, Jaideep; Crary, David (Applied Research Associates, Inc, Arlington, VA); Cheng, Karen (Applied Research Associates, Inc, Arlington, VA)
2010-11-01
These techniques appear promising for constructing an integrated, automated detect-and-characterize capability for epidemics: working off biosurveillance data, they provide information on the particular ongoing outbreak. Potential uses include crisis management, planning and resource allocation. The parameter estimation capability is ideal for providing the input parameters of an agent-based model: index cases, time of infection, infection rate. Non-communicable diseases are easier to characterize than communicable ones: a small anthrax attack can be characterized well with 7-10 days of post-detection data, whereas plague takes longer; large attacks are very easy.
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques are building blocks to comprehending concepts beyond basic statistics. It's known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
GIS-based bivariate statistical techniques for groundwater potential ...
Indian Academy of Sciences (India)
... complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be realized that these techniques are very beneficial for analyzing groundwater potential and can be practical for water-resource management ...
GIS-based bivariate statistical techniques for groundwater potential ...
Indian Academy of Sciences (India)
Ali Haghizadeh
2017-11-23
... So, these models are known as computational intelligence and machine learning techniques, used to replace physically based models. In contrast, knowledge-driven methods (KDM) use rich prior knowledge for model building based on knowledge engineering and management technologies (Azkune.
(NHIS) using data mining technique as a statistical model
African Journals Online (AJOL)
2014-05-23
... Scheme (NHIS) claims in the Awutu-Effutu-Senya District using data mining techniques, with a specific focus on ... transform them into a format that is friendly to data mining algorithms, such as ... many groups to access the data, facilitate updating the data, and improve the efficiency of checking the data for ...
GIS-Based bivariate statistical techniques for groundwater potential ...
Indian Academy of Sciences (India)
driven and knowledge-driven models (Corsini ... addition, the application of the GIS-based SI technique in groundwater potential mapping ... lithology of a given area affects the drainage density and can be of great value for evaluating ...
National Research Council Canada - National Science Library
Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E
2008-01-01
.... Innovative visual interaction techniques such as dynamic axis scaling, conjunctive parallel coordinates, statistical indicators, and aerial perspective shading are exploited to enhance the utility...
48 CFR 970.1504-1-5 - General considerations and techniques for determining fixed fees.
2010-10-01
... Contracting by Negotiation, 970.1504-1-5, General considerations and techniques for determining fixed fees. (a) ... the following eight significant factors, as outlined in order of importance, and the assignment of ... a particular annual fixed fee negotiation is established by evaluating the factors in this subsection ...
Application of Statistical Potential Techniques to Runaway Transport Studies
Energy Technology Data Exchange (ETDEWEB)
Eguilior, S.; Castejon, F. [Ciemat.Madrid (Spain); Parrondo, J. M. [Universidad Complutense. Madrid (Spain)
2001-07-01
A method for computing the runaway production rate, based on techniques of noise-activated escape in a potential, is presented in this work. A generalised potential in 2D momentum space is obtained from the deterministic or drift terms of the Langevin equations. The diffusive or stochastic terms, which arise directly from the stochastic nature of collisions, play the role of the noise that activates barrier crossings. The runaway electron source is given by the escape rate in such a potential, which is obtained from an Arrhenius-like relation. Runaway electrons are those that skip the potential barrier due to the effect of stochastic collisions. In terms of computation time, this method allows one to quickly obtain the source term for a runaway electron transport code. (Author) 11 refs.
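The Arrhenius-like relation invoked above can be sketched as an escape rate that falls exponentially with the ratio of barrier height to noise strength, Gamma = A * exp(-DeltaU / D). The prefactor, barrier height, and noise strengths below are hypothetical numbers, not values from the paper.

```python
import math

def escape_rate(prefactor, barrier, noise):
    """Arrhenius-like escape rate: Gamma = A * exp(-DeltaU / D).
    'barrier' is the generalized potential barrier height, 'noise' the
    collisional diffusion strength playing the role of temperature."""
    return prefactor * math.exp(-barrier / noise)

weak_noise   = escape_rate(1.0, 5.0, 0.5)   # barrier large relative to noise: rare escapes
strong_noise = escape_rate(1.0, 5.0, 2.0)   # stronger collisional noise: more runaways
```

The exponential sensitivity to the barrier-to-noise ratio is what makes this formulation fast: one evaluation replaces a long stochastic simulation of barrier crossings.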
Statistical techniques for the identification of reactor component structural vibrations
International Nuclear Information System (INIS)
Kemeny, L.G.
1975-01-01
The identification, on-line and in near real-time, of the vibration frequencies, modes and amplitudes of selected key reactor structural components, and the visual monitoring of these phenomena by nuclear power plant operating staff, will serve to further the safety and control philosophy of nuclear systems and lead to design optimisation. The School of Nuclear Engineering has developed a data acquisition system for vibration detection and identification. The system is interfaced with the HIFAR research reactor of the Australian Atomic Energy Commission. The reactor serves to simulate noise and vibrational phenomena which might be pertinent in power reactor situations. The data acquisition system consists of a small computer interfaced with a digital correlator and a Fourier transform unit. An incremental tape recorder is utilised as a backing store and as a means of communication with other computers. A small analogue computer and an analogue statistical analyzer can be used in the pre- and post-computational analysis of signals which are received from neutron and gamma detectors, thermocouples, accelerometers, hydrophones and strain gauges. Investigations carried out to date include a study of the role of local and global pressure fields due to turbulence in coolant flow and pump impeller induced perturbations on (a) control absorbers, (b) fuel element and (c) coolant external circuit and core tank structure component vibrations. (Auth.)
Consideration of techniques to mitigate the unauthorized 3D printing production of keys
Straub, Jeremy; Kerlin, Scott
2016-05-01
The illicit production of 3D printed keys based on remote-sensed imagery is problematic as it allows a would-be intruder to access a secured facility without the attack attempt being as obviously detectable as conventional techniques. This paper considers the problem from multiple perspectives. First, it looks at different attack types and considers the prospective attack from a digital information perspective. Second, based on this, techniques for securing keys are considered. Third, the design of keys is considered from the perspective of making them more difficult to duplicate using visible light sensing and 3D printing. Policy and legal considerations are discussed.
International Nuclear Information System (INIS)
Brodsky, A.
1986-04-01
This report provides statistical concepts and formulas for defining minimum detectable amount (MDA), bias and precision of sample analytical measurements of radioactivity for radiobioassay purposes. The defined statistical quantities and accuracy criteria were developed for use in standard performance criteria for radiobioassay, but are also useful in intralaboratory quality assurance programs. This report also includes a literature review and analysis of accuracy needs and accuracy recommendations of national and international scientific organizations for radiation or radioactivity measurements used for radiation protection purposes. Computer programs are also included for calculating the probabilities of passing or failing multiple analytical tests for different acceptable ranges of bias and precision
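The MDA concept discussed in this report is commonly approximated, for a single counting measurement, by a Currie-style detection limit. The sketch below is a generic illustration with assumed numbers, not necessarily the exact formulation used in the report: it converts an expected background count into the smallest reliably detectable amount at roughly 95% confidence for both false-positive and false-negative errors.

```python
import math

def minimum_detectable_amount(background_counts, efficiency, count_time):
    """Currie-style MDA approximation (approx. 95% confidence levels).

    background_counts -- expected background counts in the counting window
    efficiency        -- counts registered per unit activity per unit time
    count_time        -- counting time
    """
    # 2.71 handles the low-count correction; 4.65 = 2 * 1.645 * sqrt(2)
    detection_limit_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit_counts / (efficiency * count_time)

# Hypothetical example: 100 background counts, efficiency 0.1, 100 s count.
mda = minimum_detectable_amount(background_counts=100.0,
                                efficiency=0.1, count_time=100.0)
```

The same counting statistics also drive the bias and precision criteria the report defines: both are ultimately functions of the Poisson variance of the sample and background counts.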
Vaughn, Brandon K.
2009-01-01
This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this…
International Nuclear Information System (INIS)
Land, C.E.; Pierce, D.A.
1983-01-01
Statistical theory and methodology provide the logical structure for scientific inference about the cancer risk associated with exposure to ionizing radiation. Although much is known about radiation carcinogenesis, the risk associated with low-level exposures is difficult to assess because it is too small to measure directly. Estimation must therefore depend upon mathematical models which relate observed risks at high exposure levels to risks at lower exposure levels. Extrapolated risk estimates obtained using such models are heavily dependent upon assumptions about the shape of the dose-response relationship, the temporal distribution of risk following exposure, and variation of risk according to variables such as age at exposure, sex, and underlying population cancer rates. Expanded statistical models, which make explicit certain assumed relationships between different data sets, can be used to strengthen inferences by incorporating relevant information from diverse sources. They also allow the uncertainties inherent in information from related data sets to be expressed in estimates which partially depend upon that information. To the extent that informed opinion is based upon a valid assessment of scientific data, the larger context of decision theory, which includes statistical theory, provides a logical framework for the incorporation into public policy decisions of the informational content of expert opinion
Scientific Opinion on Statistical considerations for the safety evaluation of GMOs
DEFF Research Database (Denmark)
Sørensen, Ilona Kryspin
in the experimental design of field trials, such as the inclusion of commercial varieties, in order to ensure sufficient statistical power and reliable estimation of natural variability. A graphical representation is proposed to allow the comparison of the GMO, its conventional counterpart and the commercial...... such estimates are unavailable may they be estimated from databases or literature. Estimated natural variability should be used to specify equivalence limits to test the difference between the GMO and the commercial varieties. Adjustments to these equivalence limits allow a simple graphical representation so...... in this opinion may be used, in certain cases, for the evaluation of GMOs other than plants....
Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.
Walum, Hasse; Waldman, Irwin D; Young, Larry J
2016-02-01
Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
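The argument about power, prestudy odds and bias can be made concrete with an Ioannidis-style calculation. The helper below is a textbook-style illustration, not the authors' analysis; all parameter values in the example are assumed.

```python
def probability_finding_is_true(prestudy_odds, power, alpha=0.05, bias=0.0):
    """Post-study probability that a nominally significant finding is a
    true effect (Ioannidis-style PPV).

    prestudy_odds -- odds R that a tested hypothesis is true before the study
    power         -- probability of detecting a true effect
    alpha         -- significance threshold
    bias          -- fraction of analyses that would report a null result
                     as significant anyway (p-hacking, selective reporting)
    """
    r = prestudy_odds
    true_positives = (power + bias * (1.0 - power)) * r
    false_positives = alpha + bias * (1.0 - alpha)
    return true_positives / (true_positives + false_positives)
```

With the low power argued to be typical of intranasal OT studies (say 0.2) and modest prestudy odds (0.25), half of all "significant" findings would be false even before any bias is considered, which is the quantitative core of the authors' skepticism.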
International Nuclear Information System (INIS)
Tal, Balazs; Bencze, Attila; Zoletnik, Sandor; Veres, Gabor; Por, Gabor
2011-01-01
Time delay estimation (TDE) methods are well-known techniques for investigating poloidal flows in hot magnetized plasmas through the propagation properties of turbulent structures in the medium. One of these methods is based on estimating the time lag at which the cross-correlation function (CCF) estimate reaches its maximum value. The uncertainty of the peak location determines the smallest detectable flow velocity modulation, and therefore the standard deviation of the time delay imposes an important limitation on the measurements. In this article, the relative standard deviation of the CCF estimate and the standard deviation of its peak location are calculated analytically using a simple model of turbulent signals. This model assumes independent (non-interacting) overlapping events (coherent structures) with randomly distributed spatio-temporal origins moving with the background flow. The result of our calculations is a general formula for the CCF variance, which is valid not only in the high event density limit but also for arbitrary event densities. Our formula reproduces the well-known expression for high event densities previously published in the literature. In this paper we also present a derivation of the variance of the time delay estimate, which turns out to be inversely proportional to the applied time window. The derived formulas were tested in real plasma measurements. The calculations are an extension of the earlier work of Bencze and Zoletnik [Phys. Plasmas 12, 052323 (2005)], where the autocorrelation-width technique was developed. Additionally, we show that velocities calculated by a TDE method possess a broadband noise which originates from this variance; its power spectral density cannot be decreased by worsening the time resolution, and it can be coherent with the noise of other velocity measurements in which the same turbulent structures are used. This noise should not be confused with the impact of zero mean frequency zonal flow
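The CCF-peak version of TDE can be sketched in a few lines. The following fragment is an illustrative simulation with assumed signal parameters, not the authors' turbulence model: two noisy channels share a common component with a known lag, and the delay is recovered from the location of the cross-correlation maximum.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_delay = 4000, 25  # samples; imposed lag between the two channels

# A shared 'turbulent' component plus independent noise on each channel.
common = rng.standard_normal(n + true_delay)
reference = common[true_delay:] + 0.3 * rng.standard_normal(n)
delayed = common[:n] + 0.3 * rng.standard_normal(n)  # lags 'reference'

# Cross-correlate the mean-subtracted signals and take the lag of the
# CCF maximum as the time delay estimate.
cc = np.correlate(delayed - delayed.mean(),
                  reference - reference.mean(), mode="full")
lags = np.arange(-(n - 1), n)
estimated_delay = lags[np.argmax(cc)]
```

The scatter of this estimator around the true lag is exactly the variance the paper derives: it shrinks in proportion to the length of the correlation window, which is why the resulting velocity noise is broadband and cannot be averaged away by coarsening the time resolution.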
Directory of Open Access Journals (Sweden)
Henry Braun
2017-11-01
Background: Economists are making increasing use of measures of student achievement obtained through large-scale survey assessments such as NAEP, TIMSS, and PISA. The construction of these measures, employing plausible value (PV) methodology, is quite different from that of the more familiar test scores associated with assessments such as the SAT or ACT. These differences have important implications both for utilization and interpretation. Although much has been written about PVs, it appears that there are still misconceptions about whether and how to employ them in secondary analyses. Methods: We address a range of technical issues, including those raised in a recent article that was written to inform economists using these databases. First, an extensive review of the relevant literature was conducted, with particular attention to key publications that describe the derivation and psychometric characteristics of such achievement measures. Second, a simulation study was carried out to compare the statistical properties of estimates based on the use of PVs with those based on other, commonly used methods. Results: It is shown, through both theoretical analysis and simulation, that under fairly general conditions appropriate use of PVs yields approximately unbiased estimates of model parameters in regression analyses of large-scale survey data. The superiority of the PV methodology is particularly evident when measures of student achievement are employed as explanatory variables. Conclusions: The PV methodology used to report student test performance in large-scale surveys remains the state of the art for secondary analyses of these databases.
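In practice, secondary analysis with PVs amounts to running the model once per plausible value and pooling the results with Rubin's combining rules. The helper below is a generic sketch, not tied to any particular survey; the five-PV example values are invented for illustration.

```python
import statistics

def combine_plausible_values(estimates, sampling_variances):
    """Pool analyses run once per plausible value (Rubin's rules).

    estimates          -- point estimates, one per plausible value
    sampling_variances -- squared standard errors from each analysis
    """
    m = len(estimates)
    point = statistics.fmean(estimates)
    within = statistics.fmean(sampling_variances)     # average sampling var.
    between = statistics.variance(estimates)          # sample variance (m-1)
    total_variance = within + (1.0 + 1.0 / m) * between
    return point, total_variance

# Hypothetical regression coefficient estimated once per plausible value.
point, variance = combine_plausible_values(
    estimates=[2.0, 2.2, 1.9, 2.1, 1.8],
    sampling_variances=[0.04] * 5,
)
```

The (1 + 1/M) inflation of the between-PV variance is exactly what naive approaches, such as averaging the PVs into a single score before modelling, omit; that omission is why they understate standard errors in secondary analyses.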
International Nuclear Information System (INIS)
Takeda, H.; Isha, H.
1981-01-01
The paper is concerned with displacement-assumed finite elements that apply the reduced numerical integration technique to structural problems. The first part is a general consideration of the technique. Its purpose is to examine a variational interpretation of the finite element displacement formulation with the reduced integration technique in structural problems. The formulation is critically studied from the standpoint of the natural stiffness approach. It is shown that these types of elements are equivalent to a certain type of displacement- and stress-assumed mixed elements. The rank deficiency of the stiffness matrix of these elements is interpreted as a problem in the transformation from the natural system to a Cartesian system. It is shown that a variational basis of the equivalent mixed formulation is closely related to the Hellinger-Reissner functional. For simple elements, e.g. bilinear quadrilateral plane stress and plate bending, there are corresponding mixed elements derived from the functional. For relatively complex types of these elements, it is shown that they are equivalent to localized mixed elements derived from the Hellinger-Reissner functional. In the second part, typical finite elements with the reduced integration technique are studied to demonstrate this equivalence. A bilinear displacement and rotation assumed shear beam element, a bilinear displacement assumed quadrilateral plane stress element and a bilinear deflection and rotation assumed quadrilateral plate bending element are examined to present equivalent mixed elements. Not only is the theoretical consideration presented, but numerical studies are also shown to demonstrate the effectiveness of these elements in practical analysis. (orig.)
Comparison of statistical accuracy between the 'direct' and the 'reverse' time-of-flight techniques
International Nuclear Information System (INIS)
Kudryashev, V.A.; Hartung, U.
1992-01-01
The statistical accuracy of two neutron time-of-flight (TOF) diffraction techniques, the classic 'forward' TOF technique and the 'reverse' TOF technique, is compared. The problem is discussed in dependence on the diffracted spectrum, the background and some special device parameters. In general the 'reverse' TOF method yields better statistics in the spectrum's range above the medium channel content; with the classic TOF method this is achieved in the lower range. For that reason, the reverse TOF measurement is especially recommended for structure problems and the forward TOF technique for studying the background (e.g. the inelastically scattered portion). (orig.)
Pattern recognition in remote-sensing imagery using data mining and statistical techniques
Singh, Rajesh Kumar
The remote sensing image classification domain has been explored and examined by scientists in the past using classical statistical and machine-learning techniques. Statistical techniques like Bayesian classifiers are good when the data is noise-free or normalized, while implicit models, or machine learning algorithms, such as artificial neural networks (ANN), are more of a "black box", relying on iterative training to adjust parameters using transfer functions to improve their predictive ability relative to training data for which the outputs are known. The statistical approach performs better when a priori information about categories is available, but it has limitations in the case of objective classification and when the distribution of data points is not known, as is the case with remote sensing satellite data. Data mining algorithms, which have potential advantages over classical statistical classifiers in analyzing remote sensing imagery data, were examined for use in land use classification of remote sensing data. Spectral classifications of LANDSAT(TM) imagery from 1989 were conducted using data mining and statistical techniques. The site selected for this research was NASA's Kennedy Space Center (KSC) in Florida. The raw satellite data used in classification was obtained using feature-extraction image processing techniques. The classification effort can broadly be divided into two major categories: (a) Supervised classification with subjectively defined prior known classes, and (b) Unsupervised classification with objectively categorized natural groups of similar attributes. Several predictive models and segmentation classification schemes were developed. The techniques used for evaluation of spectral patterns were based on both statistical and data mining algorithms. The statistical technique involved the k-nearest neighbor method, while data mining algorithms included: (1) back-propagation artificial neural network technique for two
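The k-nearest-neighbor classifier named above is simple to sketch. The fragment below is an illustrative toy with made-up two-band "pixels" and assumed class labels, not the study's KSC data: each query pixel receives the majority label of its k closest training pixels in spectral space.

```python
import numpy as np

def knn_classify(train_pixels, train_labels, query_pixels, k=3):
    """k-nearest-neighbour spectral classification.

    Each row is one pixel's band values; a query pixel gets the majority
    label among its k nearest training pixels (Euclidean distance).
    """
    # Pairwise distances, shape (n_query, n_train), via broadcasting.
    d = np.linalg.norm(query_pixels[:, None, :] - train_pixels[None, :, :],
                       axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    votes = train_labels[nearest]
    # Majority vote per query pixel.
    return np.array([np.bincount(v).argmax() for v in votes])

# Hypothetical two-band training pixels with classes 0 and 1.
train_pixels = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
train_labels = np.array([0, 0, 1, 1])
predicted = knn_classify(train_pixels, train_labels,
                         np.array([[0.0, 0.5], [10.0, 10.5]]))
```

Unlike the parametric Bayesian classifiers criticized above, k-NN makes no assumption about the distribution of the spectral data, which is why it is attractive when, as with satellite imagery, that distribution is unknown.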
International Nuclear Information System (INIS)
Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.
1986-01-01
High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in ''clean'' areas of the site ranged from 45 to 60%
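The "statistically determined minimum number of random samples" can be illustrated with a standard hitting-probability argument. The sketch below is a generic formula with assumed inputs, not the site's actual sampling design: it finds the smallest n such that at least one of n uniformly placed samples lands in a contaminated fraction p of an area, with the desired confidence.

```python
import math

def min_random_samples(contaminated_fraction, confidence=0.95):
    """Smallest n with P(at least one sample hits contamination) >= confidence.

    Assumes a fraction 'contaminated_fraction' of the area is contaminated
    and samples land independently and uniformly at random.
    """
    miss_probability = 1.0 - confidence
    return math.ceil(math.log(miss_probability) /
                     math.log(1.0 - contaminated_fraction))

# Hypothetical example: 10% of an area contaminated, 95% confidence.
n_needed = min_random_samples(contaminated_fraction=0.10, confidence=0.95)
```

Derivation: P(all n samples miss) = (1 - p)^n, so requiring this to be at most 1 - confidence gives n >= ln(1 - confidence) / ln(1 - p). Areas whose n random samples all come back clean can then be excluded with a stated, defensible confidence, which is the cost-saving logic of the screening approach.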
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis
DEFF Research Database (Denmark)
Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...
Statistical designs and response surface techniques for the optimization of chromatographic systems.
Ferreira, Sergio Luis Costa; Bruns, Roy Edward; da Silva, Erik Galvão Paranhos; Dos Santos, Walter Nei Lopes; Quintella, Cristina Maria; David, Jorge Mauricio; de Andrade, Jailson Bittencourt; Breitkreitz, Marcia Cristina; Jardim, Isabel Cristina Sales Fontes; Neto, Benicio Barros
2007-07-27
This paper describes fundamentals and applications of multivariate statistical techniques for the optimization of chromatographic systems. The response surface methodologies central composite design, Doehlert matrix and Box-Behnken design are discussed, and applications of these techniques for the optimization of sample preparation steps (extractions) and the determination of experimental conditions for chromatographic separations are presented. The use of mixture designs for the optimization of mobile phases is also described. An optimization example involving a real separation process is described exhaustively. A discussion of model validation is presented. Some applications of other multivariate techniques for the optimization of chromatographic methods are also summarized.
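A two-factor central composite design of the kind discussed can be constructed and fitted in a few lines. In this sketch the simulated response, and its optimum at (0.5, -0.25), are invented for illustration; the nine design points support a full quadratic model whose stationary point recovers that optimum.

```python
import numpy as np

# Two-factor central composite design: 2^2 factorial corners, axial
# ("star") points at distance alpha, plus a centre point.
alpha = np.sqrt(2.0)  # rotatable choice for two factors
corners = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
stars = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
design = np.array(corners + stars + [(0, 0)], dtype=float)

# Simulated chromatographic response with a known optimum at (0.5, -0.25).
x1, x2 = design[:, 0], design[:, 1]
y = 10.0 - (x1 - 0.5) ** 2 - 2.0 * (x2 + 0.25) ** 2

# Fit the full quadratic response-surface model by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12, b11, b22 = coef

# Stationary point of the fitted surface (no interaction term here).
x1_opt, x2_opt = -b1 / (2.0 * b11), -b2 / (2.0 * b22)
```

For k factors the same pattern uses 2^k corners, 2k star points and centre replicates, with alpha = (2^k)^(1/4) for rotatability; Doehlert and Box-Behnken designs trade some of this symmetry for fewer runs.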
Directory of Open Access Journals (Sweden)
D.P. van der Nest
2015-03-01
This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.
de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.
2017-01-01
ABSTRACT Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity, GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
Higgins, J.; Fullwood, R.; Kroeger, P.; Youngblood, R.
1992-01-01
The PIUS (Process Inherent Ultimate Safety) reactor is an advanced-design nuclear power plant that uses passive safety features and basic physical processes to address safety concerns. Brookhaven National Laboratory (BNL) performed a detailed study of the PIUS design for the NRC using primarily qualitative engineering analysis techniques. Some quantitative methods were also employed. There are three key initial areas of analysis: FMECA, HAZOP, and deterministic analyses, which are described herein. Once these three analysis methods were completed, the important findings from each of the methods were assembled into the PIUS Interim Table (PIT). This table thus contains a first-cut sort of the important design considerations and features of the PIUS reactor. The table also identifies some potential initiating events and systems used for mitigating these initiators. The next stage of the analysis was the construction of event trees for each of the identified initiators. The most significant sequences were then determined qualitatively, using some quantitative input. Finally, overall insights on the PIUS design, drawn from the PIT and from the event tree analysis, were developed and presented
Fogleman, Sarah; Santana, Casey; Bishop, Casey; Miller, Alyssa; Capco, David G
2016-01-01
Thousands of mothers are at risk of transmitting mitochondrial diseases to their offspring each year, with the most severe form of these diseases being fatal [1]. With no cure, transmission prevention is the only current hope for decreasing the disease incidence. Current methods of prevention rely on low mutant maternal mitochondrial DNA levels, while those with levels close to or above threshold (>60%) are still at a very high risk of transmission [2]. Two novel approaches may offer hope for preventing and treating mitochondrial disease: mitochondrial replacement therapy, and CRISPR/Cas9. Mitochondrial replacement therapy has emerged as a promising tool that has the potential to prevent transmission in patients with higher mutant mitochondrial loads. This method is the subject of many ethical concerns due to its use of a donor embryo to transplant the patient's nuclear DNA; however, it has ultimately been approved for use in the United Kingdom and was recently declared ethically permissible by the FDA. The leading-edge CRISPR/Cas9 technology exploits the principles of bacterial immune function to target and remove specific sequences of mutated DNA. This may have potential in treating individuals with disease caused by mutant mitochondrial DNA. As the technology progresses, it is important that the ethical considerations discussed herein become more established. The purpose of this review is to discuss current research surrounding the procedure and efficacy of the techniques, compare the ethical concerns of each approach, and look into the future of mitochondrial gene replacement therapy.
Directory of Open Access Journals (Sweden)
Rink eHoekstra
2012-05-01
Full Text Available A valid interpretation of most statistical techniques requires that the criteria for one or more assumptions are met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another, more disquieting, explanation would be that violations of assumptions are hardly checked for in the first place. In this article a study is presented on whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. They were asked to analyze the data as they would their own data, for which often used and well-known techniques like the t-procedure, ANOVA and regression were required. It was found that they hardly ever checked for violations of assumptions. Interviews afterwards revealed that mainly lack of knowledge and nonchalance, rather than more rational reasons like being aware of the robustness of a technique or unfamiliarity with an alternative, seem to account for this behavior. These data suggest that merely encouraging people to check for violations of assumptions will not lead them to do so, and that the use of statistics is opportunistic.
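As a concrete illustration of the kind of check the surveyed researchers rarely performed, here is a minimal standard-library Python sketch that screens two groups for grossly unequal variances before a t-procedure. The data and the 4:1 ratio rule of thumb are illustrative assumptions, not taken from the study itself.

```python
import statistics

def variance_ratio_check(group_a, group_b, max_ratio=4.0):
    """Crude homogeneity-of-variance screen sometimes suggested before a
    two-sample t-test: flag the data if the larger sample variance exceeds
    the smaller by more than max_ratio."""
    va = statistics.variance(group_a)
    vb = statistics.variance(group_b)
    ratio = max(va, vb) / min(va, vb)
    return ratio, ratio <= max_ratio

a = [4.1, 3.9, 4.3, 4.0, 4.2]          # tight measurements
b = [3.0, 5.5, 1.2, 6.8, 2.4]          # widely scattered measurements
ratio, ok = variance_ratio_check(a, b)
print(f"variance ratio {ratio:.1f}, assumption plausible: {ok}")
```

A check this simple takes one line in any analysis script, which underlines the article's point that the barrier is habit rather than effort.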
Statistical Approaches in GIS-Based Techniques for Sustainable Planning: Kayaçukuru Case
Aygun Erdogan
2003-01-01
The purpose of this study is to make both a summary and additional descriptive and inferential statistical analyses for a completed thesis on "Sustainable/Environment Friendly Development Planning of Fethiye-Kayaçukuru Using GIS-Based Techniques" (M.Sc. in the Graduate School of Geodetic and Geographic Information Technologies, Middle East Technical University, Supervisor: Assoc.Prof.Dr. Oğuz Işık, September 2000, 214 pages). The statistical analyses explained in this paper comprise a part of...
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor
2016-09-01
In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
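The three parameters mentioned can be sketched with the standard textbook relations for motor replacement economics. The motor rating, efficiencies, tariff and price premium below are illustrative placeholders, not the Pakistani industry data described in the article.

```python
def motor_savings(power_kw, hours_per_year, eff_std, eff_hi,
                  tariff_per_kwh, extra_cost):
    """Annual energy saving (kWh), annual cost saving, and simple payback
    period (years) when a standard-efficiency motor is replaced by a
    high-efficiency motor delivering the same shaft power."""
    energy_saving = power_kw * hours_per_year * (1.0 / eff_std - 1.0 / eff_hi)
    cost_saving = energy_saving * tariff_per_kwh
    payback = extra_cost / cost_saving
    return energy_saving, cost_saving, payback

e, c, p = motor_savings(power_kw=15, hours_per_year=6000,
                        eff_std=0.89, eff_hi=0.93,
                        tariff_per_kwh=0.12, extra_cost=400)
print(f"{e:.0f} kWh/yr saved, {c:.0f} currency units/yr, payback {p:.1f} yr")
```

Confidence bounds as in [1] would then be constructed around samples of these three quantities across many motors.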
International Nuclear Information System (INIS)
Park, Jinyong; Balasingham, P.; McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W.
2004-01-01
Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater
An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques
2018-01-09
ARL-TR-8272, January 2018, US Army Research Laboratory: An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques.
Ratner, Bruce
2011-01-01
The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has
[Statistical study of the wavelet-based lossy medical image compression technique].
Puniene, Jūrate; Navickas, Ramūnas; Punys, Vytenis; Jurkevicius, Renaldas
2002-01-01
Medical digital images have informational redundancy. Both the amount of memory for image storage and their transmission time could be reduced if image compression techniques are applied. The techniques are divided into two groups: lossless (compression ratio does not exceed 3 times) and lossy ones. Compression ratio of lossy techniques depends on visibility of distortions. It is a variable parameter and it can exceed 20 times. A compression study was performed to evaluate the compression schemes, which were based on the wavelet transform. The goal was to develop a set of recommendations for an acceptable compression ratio for different medical image modalities: ultrasound cardiac images and X-ray angiographic images. The acceptable image quality after compression was evaluated by physicians. Statistical analysis of the evaluation results was used to form a set of recommendations.
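A toy sketch of the lossy principle being evaluated: a single-level 1-D Haar transform whose small detail coefficients are thresholded to zero. Real medical codecs use multi-level 2-D wavelet transforms plus entropy coding; this fragment only illustrates where the controllable distortion comes from.

```python
def haar_1d(signal):
    """One level of the 1-D Haar wavelet transform: pairwise averages
    (low-pass) and pairwise half-differences (detail)."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def lossy_compress(signal, threshold):
    """Zero out small detail coefficients -- the lossy step that trades
    compression ratio against visible distortion."""
    avg, det = haar_1d(signal)
    det = [d if abs(d) >= threshold else 0.0 for d in det]
    return avg, det

def reconstruct(avg, det):
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

row = [52, 50, 51, 49, 200, 202, 97, 95]   # one row of pixel intensities
avg, det = lossy_compress(row, threshold=2.0)
print(reconstruct(avg, det))
```

Raising the threshold zeroes more coefficients (higher compression ratio) at the cost of larger reconstruction error, which is exactly the trade-off the physicians' quality ratings calibrate.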
Directory of Open Access Journals (Sweden)
VIMALA C.
2015-05-01
Full Text Available In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, Template Matching techniques like Dynamic Time Warping (DTW), Statistical Pattern Matching techniques such as Hidden Markov Models (HMM) and Gaussian Mixture Models (GMM), and Machine Learning techniques such as Neural Networks (NN), Support Vector Machines (SVM), and Decision Trees (DT) are most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the above developed system, the highest word recognition accuracy is achieved with the HMM technique. It offered 100% accuracy during the training process and 97.92% for the testing process.
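The DTW template matching mentioned above can be sketched in a few lines of plain Python. The sequences here are toy 1-D signals; a real ASR front end would compare sequences of MFCC feature vectors instead of scalars.

```python
def dtw_distance(s, t):
    """Dynamic Time Warping: minimal cumulative distance between two
    sequences, allowing non-linear stretching along the time axis."""
    inf = float("inf")
    n, m = len(s), len(t)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            # best of: insertion, deletion, match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A template matched against a time-stretched copy of itself should score
# far lower than against a different pattern.
template = [0, 1, 2, 3, 2, 1, 0]
stretched = [0, 0, 1, 2, 2, 3, 2, 1, 0]
other = [3, 3, 3, 0, 0, 0, 3]
print(dtw_distance(template, stretched), dtw_distance(template, other))
```

An isolated-word recognizer then labels an utterance with the template achieving the smallest DTW distance.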
Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.
2012-04-01
Climate and soil are two of the most important limiting factors for agricultural production. Nowadays climate change has been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools to simulate the most relevant aspects of the climate expected for the XXI century in the frame of climate change. These models are able to reproduce the general features of atmospheric dynamics, but their low resolution (about 200 km) prevents a proper simulation of lower-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international settings, giving excellent results for future climate scenarios. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape-growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, especially grape, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to negative impacts produced by any modification of climate in the different regions, mainly those with a protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
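The first step of a two-step analogue method can be sketched as a nearest-neighbour search over large-scale predictor fields. The fields and the plain Euclidean distance below are illustrative assumptions, not FIC's operational configuration.

```python
def nearest_analogues(target_day, historical_days, k=2):
    """Rank historical days by Euclidean distance between their large-scale
    predictor fields and the target day's field; the local observations of
    the closest analogues then drive the downscaled estimate."""
    def dist(field):
        return sum((a - b) ** 2 for a, b in zip(target_day, field)) ** 0.5
    return sorted(range(len(historical_days)),
                  key=lambda i: dist(historical_days[i]))[:k]

# Toy predictor fields (e.g., gridded pressure anomalies) for three days.
history = [[1.0, 0.2, -0.5], [0.9, 0.1, -0.4], [-1.2, 2.0, 0.8]]
print(nearest_analogues([1.0, 0.15, -0.45], history))
```

The second step of the method would fit a local statistical relation using only those analogue days.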
Directory of Open Access Journals (Sweden)
Land Walker H
2011-01-01
Full Text Available Abstract Background When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
Heine, John J; Land, Walker H; Egan, Kathleen M
2011-01-27
International Nuclear Information System (INIS)
Carvajal Escobar Yesid; Munoz, Flor Matilde
2007-01-01
This project centres on a review of the state of the art of the ocean-atmospheric phenomena that affect Colombian hydrology, especially the ENSO phenomenon, which causes a socioeconomic impact of the first order in our country and has not been sufficiently studied; it is therefore important to address this topic, including the macroclimatic variables associated with ENSO in water-planning analyses. The analyses include a review of statistical techniques for checking the consistency of hydrological data, with the objective of building a reliable and homogeneous database of monthly flows of the Cauca River. Statistical methods (multivariate data analysis), specifically principal component analysis, are used in the development of models for predicting monthly mean flows of the Cauca River, covering linear approaches such as the autoregressive AR, ARX and ARMAX models and a nonlinear approach, artificial neural networks.
Directory of Open Access Journals (Sweden)
В А Бубнов
2017-12-01
Full Text Available This article describes the content of a teaching technique for the computer classroom, illustrated by a statistical analysis of the dollar price in rubles during March 2017 using Microsoft Excel. Starting from the conventional data describing the dynamics of the dollar price by day of the month, the analysis makes it possible to identify the days of the month on which the dollar price is grouped around the monthly average, and also to reveal so-called rare days on which the dollar price differs strongly from the average, both downward and upward.
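The same classification is easy to reproduce outside Excel. A standard-library Python sketch, with invented prices rather than the March 2017 series, separating days near the monthly mean from the "rare" days:

```python
import statistics

def classify_days(prices, band=0.5):
    """Split day numbers into those whose price lies within `band` sample
    standard deviations of the monthly mean and the 'rare' days outside it."""
    mu = statistics.mean(prices)
    sd = statistics.stdev(prices)
    near = [d for d, p in enumerate(prices, 1) if abs(p - mu) <= band * sd]
    rare = [d for d, p in enumerate(prices, 1) if abs(p - mu) > band * sd]
    return near, rare

prices = [58.0, 58.2, 58.1, 57.9, 58.0, 60.5, 58.1, 55.8]   # toy daily rates
near, rare = classify_days(prices)
print("rare days:", rare)
```

The 0.5-standard-deviation band is an arbitrary illustrative choice; the article's grouping criterion may differ.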
The k-means clustering technique: General considerations and implementation in Mathematica
Directory of Open Access Journals (Sweden)
Laurence Morissette
2013-02-01
Full Text Available Data clustering techniques are valuable tools for researchers working with large databases of multivariate data. In this tutorial, we present a simple yet powerful one: the k-means clustering technique, through three different algorithms: the Forgy/Lloyd algorithm, the MacQueen algorithm and the Hartigan and Wong algorithm. We then present an implementation in Mathematica and various examples of the different options available to illustrate the application of the technique.
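A minimal pure-Python sketch of the Lloyd iteration (the tutorial's implementation is in Mathematica; here the first k points seed the centroids for reproducibility, where the Forgy variant would sample them at random):

```python
def kmeans(points, k, iters=20):
    """Lloyd's algorithm: assign each point to its nearest centroid, then
    move each centroid to the mean of its cluster."""
    centroids = list(points[:k])           # deterministic first-k seeding
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        new_centroids = []
        for j, cl in enumerate(clusters):
            if cl:
                new_centroids.append(tuple(sum(xs) / len(cl)
                                           for xs in zip(*cl)))
            else:
                new_centroids.append(centroids[j])  # keep an empty cluster's seed
        centroids = new_centroids
    return centroids

pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
       (5.0, 5.1), (5.2, 5.0), (4.9, 5.2)]
print(sorted(kmeans(pts, 2)))
```

With two well-separated blobs the iteration converges in a couple of passes to one centroid per blob; the MacQueen and Hartigan-Wong variants differ mainly in when and how centroids are updated.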
GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)
Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza
2017-12-01
Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the capability of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. Validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, indicating that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, permitting investigation of both systemic and stochastic uncertainty. Finally, these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
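The statistical index for one class of one conditioning factor is commonly defined as the log of the spring density in that class relative to the density over the whole study area; a sketch with invented spring and cell counts (the exact SI formulation used in the paper may differ in detail):

```python
import math

def statistical_index(spring_count, cell_count, total_springs, total_cells):
    """Bivariate statistical index for one class of a conditioning factor:
    positive when the class is richer in springs than the study area
    average, negative when it is poorer."""
    class_density = spring_count / cell_count
    overall_density = total_springs / total_cells
    return math.log(class_density / overall_density)

# Toy example: 60 of 496 springs fall in a class covering 500 of 10000 cells.
si = statistical_index(60, 500, 496, 10000)
print(round(si, 2))
```

Summing the SI values of the classes a cell belongs to, over all 11 conditioning factors, yields that cell's score on the groundwater potential map.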
Platnick, S.
1999-01-01
Photon transport in a multiple scattering medium is critically dependent on scattering statistics, in particular the average number of scatterings. A superposition technique is derived to accurately determine the average number of scatterings encountered by reflected and transmitted photons within arbitrary layers in plane-parallel, vertically inhomogeneous clouds. As expected, the resulting scattering number profiles are highly dependent on cloud particle absorption and solar/viewing geometry. The technique uses efficient adding and doubling radiative transfer procedures, avoiding traditional time-intensive Monte Carlo methods. Derived superposition formulae are applied to a variety of geometries and cloud models, and selected results are compared with Monte Carlo calculations. Cloud remote sensing techniques that use solar reflectance or transmittance measurements generally assume a homogeneous plane-parallel cloud structure. The scales over which this assumption is relevant, in both the vertical and horizontal, can be obtained from the superposition calculations. Though the emphasis is on photon transport in clouds, the derived technique is applicable to any scattering plane-parallel radiative transfer problem, including arbitrary combinations of cloud, aerosol, and gas layers in the atmosphere.
MacLean, Adam L.
2015-12-16
The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
Wu, Jianning; Wu, Bin
2015-01-01
The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with the different statistical distributions of gait variables from the left and right lower limbs; that is, the discrimination of small differences in similarity between the lower limbs is treated as the recognition of their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed around an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more intrinsic dynamic information hidden in the gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs when compared with the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis. PMID:25705672
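The traditional baseline referred to above is commonly computed as a relative left-right difference. A sketch of that classical index, with toy force values (the exact definition used in the paper may differ; the SVM pipeline itself is beyond a short fragment):

```python
def symmetry_index(left, right):
    """Classical gait symmetry index (%): relative difference between a
    left- and right-limb gait variable; 0 means perfect symmetry."""
    return 100.0 * (left - right) / (0.5 * (left + right))

# Toy peak vertical ground-reaction forces (N) from a force platform.
print(round(symmetry_index(812.0, 798.0), 2))
```

The paper's point is that an SVM trained on the full distributions of such variables separates left from right patterns even when this scalar index reports near-symmetry.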
de Savigny, D; Riley, I; Chandramohan, D; Odhiambo, F; Nichols, E; Notzon, S; AbouZahr, C; Mitra, R; Cobos Muñoz, D; Firth, S; Maire, N; Sankoh, O; Bronson, G; Setel, P; Byass, P
2017-01-01
ABSTRACT Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time befor...
Costa, Elisabete C; de Melo-Diogo, Duarte; Moreira, André F; Carvalho, Marco P; Correia, Ilídio J
2018-01-01
Scalable and reproducible production of 3D cellular spheroids is highly demanded by pharmaceutical companies for drug screening purposes during the pre-clinical evaluation phase. These 3D cellular constructs, unlike the monolayer culture of cells, can mimic different features of human tissues, including cellular organization, cell-cell and cell-extracellular matrix (ECM) interactions. Up to now, different techniques (scaffold-based and -free) have been used for spheroid formation, the Liquid Overlay Technique (LOT) being one of the most explored methodologies due to its low cost and easy handling. Additionally, during the last few decades, this technique has been widely investigated in order to enhance its potential for application in high-throughput analysis. Herein, an overview of LOT advances, practical approaches, and troubleshooting is provided for those researchers who intend to produce spheroids using LOT for drug screening purposes. Moreover, the advantages of the LOT over the other scaffold-free techniques used for spheroid formation are also addressed. Highlights: 2D cell culture drawbacks are summarized; spheroids mimic the features of human tissues; scaffold-based and scaffold-free technologies for spheroid production are discussed; advantages of LOT over other scaffold-free techniques are highlighted; LOT advances, practical approaches and troubleshooting are underlined. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Star-Galaxy Classification Using Data Mining Techniques with Considerations for Unbalanced Datasets
O'Keefe, P. J.; Gowanlock, M. G.; McConnell, S. M.; Patton, D.
2009-09-01
We used a range of data-mining techniques in an effort to improve the classification of stars and galaxies for imaging data from the Canada-France-Hawaii Telescope Legacy Survey (CFHTLS), and extracted with SExtractor. We found that the Artificial Neural Network (ANN) achieved higher accuracies than Support Vector Machines, but was outperformed by the Random Forest and Decision Tree data-mining techniques on 5000 randomly sampled objects. This has potentially negative implications for SExtractor which uses an ANN to produce a measure of stellarity for each object. We found that the classification of stars and galaxies can be improved by voting (between Decision Trees, Random Forests and ANNs) and using balanced datasets. For the balanced datasets that we created, the three data mining techniques agreed over 80% of the time on the type of object.
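The voting step can be sketched as a simple per-object majority over classifier labels. The labels below are toy values; the study's actual inputs are SExtractor measurements of CFHTLS objects.

```python
def majority_vote(predictions):
    """Combine per-classifier label lists ('star'/'galaxy') by simple
    majority, as when voting among Decision Trees, Random Forests and ANNs.
    With an odd number of binary classifiers no ties can occur."""
    return [max(set(votes), key=votes.count)
            for votes in zip(*predictions)]

tree   = ["star", "galaxy", "star",   "star"]
forest = ["star", "galaxy", "galaxy", "star"]
ann    = ["galaxy", "galaxy", "star", "star"]
print(majority_vote([tree, forest, ann]))
```

Balancing the training set (equal numbers of stars and galaxies) addresses the separate problem that a classifier can score high accuracy simply by always predicting the majority class.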
International Nuclear Information System (INIS)
Beale, E.M.L.
1983-05-01
The Department of the Environment has embarked on a programme to develop computer models to help with the assessment of sites suitable for the disposal of nuclear wastes. The first priority is to produce a system, based on the System Variability Analysis Code (SYVAC) obtained from Atomic Energy of Canada Ltd., suitable for assessing radioactive waste disposal in land repositories containing non-heat-producing wastes from typical UK sources. The requirements of the SYVAC system development were so diverse that each portion of the development was contracted to a different company. Scicon are responsible for software coordination, system integration and the user interface. The present report contains comments on 'Statistical techniques for the development and application of SYVAC'. (U.K.)
International Nuclear Information System (INIS)
Kumar, A.; Srinivasan, M.
1986-01-01
A new equation, called the neutron multiplicity equation (NME), has been derived starting from basic physics principles. Neutron multiplicity v is defined as the integral number of neutrons leaking from a neutron-multiplying system for a source neutron introduced into it. The probability distribution of neutron multiplicities (PDNM) gives the probability of leakage of neutrons as a function of their multiplicity v. The PDNM is directly measurable through statistical correlation techniques. In a specific application, the NME has been solved for the PDNM as a function of v for 9Be spheres of varying radii driven by a centrally located 14-MeV deuterium-tritium neutron source. The potential of the NME for sensitivity analysis is demonstrated through a particular modification of the secondary neutron transfer cross sections of 9Be. It turns out that the PDNM is very sensitive to this modification, even though the 'average' neutron leakage is practically insensitive to it.
Source of statistical noises in the Monte Carlo sampling techniques for coherently scattered photons
Muhammad, Wazir; Lee, Sang Hoon
2013-01-01
Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs), and their advantages and shortcomings in calculating elastic scattering cross sections, can be found in the literature. However, the issues related to their implementation in Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method for drawing the integrated values of the squared RFFs/MFFs over the squared momentum transfer. In the current study, the roles and issues of RFFs/MFFs and the LIT in MC sampling for coherent scattering were analyzed. The results showed that the relative probability density curves sampled on the basis of MFFs are unable to reveal any extra scientific information, as both the RFFs and MFFs produced the same MC-sampled curves. Furthermore, no relationship was established between the multiple small peaks and irregular step shapes (i.e. statistical noise) in the PDFs and either the RFFs or the MFFs. In fact, the noise in the PDFs appeared due to the use of the LIT. The density of the noise depends upon the interval length between two consecutive points in the input data table and has no physical origin. The probability density function curves became smoother as the interval lengths were decreased. In conclusion, these statistical noises can be efficiently removed by introducing more data points in the data tables. PMID:22984278
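The table-density effect described above can be illustrated with a small sketch: inverse-transform sampling from a tabulated CDF via linear interpolation, once with a sparse table and once with a dense one. This is an assumed toy setup (a truncated exponential stands in for the form-factor integrals), not the paper's code.

```python
# Inverse-transform sampling through a linearly interpolated CDF table (LIT):
# a sparse table distorts the sampled PDF; a denser table removes the artifact.
import numpy as np

rng = np.random.default_rng(0)

def sample_via_lit(n_draws, n_table):
    """Sample a truncated exp(-x) on [0, 5] from an n_table-point CDF table."""
    x = np.linspace(0.0, 5.0, n_table)
    cdf = 1.0 - np.exp(-x)
    cdf /= cdf[-1]                      # normalize the truncated CDF
    u = rng.uniform(size=n_draws)
    return np.interp(u, cdf, x)         # linear interpolation between table points

def mse_vs_exact(samples):
    """Mean squared error between the sampled PDF (histogram) and the exact PDF."""
    h, edges = np.histogram(samples, bins=100, range=(0.0, 5.0), density=True)
    mid = 0.5 * (edges[:-1] + edges[1:])
    exact = np.exp(-mid) / (1.0 - np.exp(-5.0))
    return float(np.mean((h - exact) ** 2))

coarse_err = mse_vs_exact(sample_via_lit(100_000, 8))   # sparse table: steppy PDF
fine_err = mse_vs_exact(sample_via_lit(100_000, 800))   # dense table: smooth PDF
```

The sparse table yields a piecewise-constant implied density whose steps mimic the "statistical noise" in the abstract; adding table points shrinks the error, consistent with the paper's conclusion.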
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) across different window lengths. However, most real systems are nonlinear, which means the linear PCA method cannot tackle nonlinearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this can further improve fault detection performance by reducing the FAR through an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
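The exponential-weighting step above can be sketched as a standard EWMA filter applied to residuals before the GLRT stage. This is a minimal illustration under assumed parameters (smoothing constant, noise level, fault size), not the paper's full KPCA pipeline.

```python
# EWMA of residuals: z_t = lam * r_t + (1 - lam) * z_{t-1}, giving
# geometrically decaying weight to past residuals (more weight to recent data).
import numpy as np

def ewma(residuals, lam=0.3):
    """Exponentially weighted moving average of a residual sequence."""
    z = np.empty(len(residuals), dtype=float)
    z[0] = residuals[0]
    for t in range(1, len(residuals)):
        z[t] = lam * residuals[t] + (1.0 - lam) * z[t - 1]
    return z

rng = np.random.default_rng(1)
r = rng.normal(0.0, 1.0, 500)   # residuals under normal operation
r[250:] += 1.5                  # simulated fault: a mean shift at t = 250
z = ewma(r)                     # smoothed statistic fed to the detection test
```

The filtered statistic has a much smaller variance than the raw residuals (roughly lam/(2 - lam) of the original), which is what lets the downstream GLRT flag the mean shift with fewer false alarms.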
Duhan, Darshana; Pandey, Ashish
2015-08-01
In this study, downscaling models were developed for projections of monthly maximum and minimum air temperature at three stations, namely Allahabad, Satna, and Rewa in the Tons River basin, a sub-basin of the Ganges River in Central India. Three downscaling techniques, namely multiple linear regression (MLR), artificial neural network (ANN), and least squares support vector machine (LS-SVM), were used for model development, and the best-identified model was used to simulate the future predictand (temperature) using the third-generation Canadian Coupled Global Climate Model (CGCM3) simulation of the A2 emission scenario for the period 2001-2100. The performance of the models was evaluated based on four statistical performance indicators. To reduce the bias in the monthly projected temperature series, a bias correction technique was employed. The results show that all the models are able to simulate temperature; however, the LS-SVM models perform slightly better than ANN and MLR. The best-identified LS-SVM models were then employed to project future temperature. The future projections show increasing trends in maximum and minimum temperature under the A2 scenario. Further, it is observed that minimum temperature will increase at a greater rate than maximum temperature.
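The abstract does not specify which bias correction was used; a common choice for temperature is a monthly additive correction, sketched below with hypothetical stand-in data (the function name and the +2 °C bias are assumptions for illustration).

```python
# Monthly additive bias correction for temperature: shift each calendar month
# of the projected series by the historical (observed - modeled) monthly bias.
import numpy as np

def monthly_additive_bias_correction(obs, model_hist, model_future, months):
    """Apply the per-month mean bias (obs - model_hist) to the future series."""
    corrected = model_future.astype(float).copy()
    for m in range(1, 13):
        bias = obs[months == m].mean() - model_hist[months == m].mean()
        corrected[months == m] += bias
    return corrected

rng = np.random.default_rng(0)
months = np.tile(np.arange(1, 13), 10)            # ten years of monthly data
model_hist = rng.normal(20.0, 5.0, months.size)   # modeled historical Tmax (degC)
obs = model_hist + 2.0                            # observations: constant +2 bias
model_future = rng.normal(22.0, 5.0, months.size)
corrected = monthly_additive_bias_correction(obs, model_hist, model_future, months)
```

With a constant +2 °C model bias, every month's correction is exactly +2 °C, so the corrected future series is the raw projection shifted up by 2 °C.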
Directory of Open Access Journals (Sweden)
Khalifa M. Al-Kindi
2017-08-01
In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence, and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied to mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of the related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies and gives suggestions on monitoring and surveillance methods for designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.
Directory of Open Access Journals (Sweden)
Tudor BURLAN-ROTAR
2017-05-01
Groundwater studies consist of data acquisition, processing, and interpretation. In areas of hydrogeological interest, a network of drilled wells is assumed to exist; this network provides a first source of hydrogeological information. Electromagnetic (EM) mapping of such areas uses data obtained from the existing network of drilled wells for calibration and confirmation. EM measurements can highlight the existence of several layers with different characteristics: clay, limestone, sand, etc. Interpreted groundwater studies are used to develop a regional hydrogeological model. The application of electromagnetic techniques for measuring soil resistivity or conductivity has been known for a long time. Conductivity is preferable in inductive techniques, as instrument readings are generally directly proportional to conductivity and inversely proportional to resistivity. The operating principle of this method is as follows: a Tx transmitter coil, supplied with alternating current at an audio frequency, is placed on the ground. An Rx receiver coil is located a short distance, s, away from the Tx coil. The magnetic field varies in time, and the Tx coil induces very small currents in the ground. These currents generate a secondary magnetic field, Hs, which is sensed by the Rx receiver coil together with the primary magnetic field, Hp. The ratio of the secondary field, Hs, to the primary magnetic field, Hp (Hs/Hp), is directly proportional to terrain conductivity. By measuring this ratio, it is possible to construct a device which measures terrain conductivity by a contactless, direct-reading electromagnetic technique (linear meter). This technique for measuring conductivity by electromagnetic induction, using Very Low Frequency (VLF), is a non-intrusive, non-destructive sampling method. The measurements can be done quickly and are not expensive. The electromagnetic induction technology was originally developed for the mining
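The proportionality between Hs/Hp and terrain conductivity stated above is usually written, under the standard low-induction-number approximation, as sigma_a = 4 (Hs/Hp) / (omega * mu0 * s^2). The sketch below assumes that form; the instrument frequency and coil spacing are illustrative values, not ones taken from this paper.

```python
# Apparent terrain conductivity from the measured Hs/Hp ratio
# (low-induction-number approximation for a two-coil EM instrument).
import math

MU0 = 4.0e-7 * math.pi  # permeability of free space (H/m)

def apparent_conductivity(hs_over_hp, freq_hz, coil_spacing_m):
    """sigma_a = 4 * (Hs/Hp) / (omega * mu0 * s^2), returned in S/m."""
    omega = 2.0 * math.pi * freq_hz
    return 4.0 * hs_over_hp / (omega * MU0 * coil_spacing_m ** 2)

# Illustrative instrument parameters (assumed): 9.8 kHz, 3.66 m coil spacing.
sigma = apparent_conductivity(hs_over_hp=2.0e-4, freq_hz=9800.0, coil_spacing_m=3.66)
```

Because the relation is linear in Hs/Hp, a direct-reading "linear meter" only needs the fixed factor 4/(omega * mu0 * s^2) to display conductivity.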
Adamu Abdul Abubakar; John Bayo Adeyanju; Raphael O. Chukudi Kene; Mohammad Legbo Sonfada; Abubakar Sadiq Yakubu; Umaru Adamu
2012-01-01
Three suture techniques were evaluated cosmetically and economically in the closure of caprine skin incisions. Fifteen apparently healthy male and female intact goats, free of any dermatological lesions, were used for the investigation. They were randomly grouped into three: A (subcuticular), B (Ford interlocking), and C (simple interrupted). The cosmetic appearance of the surgical site was assessed on days 7, 14 and 21 post-surgery using the standard procedure described by Sakka et al, 1995...
A critical review of wetland greenhouse gas measurement techniques and scaling considerations
Allen, S. T.; Krauss, K. W.; Stagg, C. L.; Neubauer, S. C.
2016-12-01
The role of wetlands in terrestrial greenhouse gas fluxes is disproportionately large compared to the relatively small terrestrial area they encompass. There is an established and growing interest in accurately measuring these fluxes, and extrapolating inferences to larger spatial scales. However, a lack of uniformity in measurement approaches impedes progress because it is a challenge to synthesize data, parameterize models, and develop generalizable concepts from disparate data. Furthermore, pairing different methods can result in double-counting and other aggregation errors. Our objective is to review gas flux measurement techniques and synthesize concepts, factors, and constraints associated with measuring and scaling greenhouse gas fluxes. This work will contribute to a conceptual framework designed to aid in the collection and use of gas flux data obtained by different methods. This review focuses specifically on wetlands, which have both distinct transport processes and a unique biogeochemical environment, causing gas fluxes that are not prominent in other terrestrial or aquatic systems. We review techniques and implications of measuring at different steps along the soil-plant-atmosphere continuum; an emphasis of this work is identifying pathways and transit times for different fluxes in different wetland hydrogeomorphic settings. Measurement location along the path from source to atmosphere connotes the spatial and temporal scales at which a technique is applied, the spatiotemporal representation, and the factors that constrain extrapolation.
Díaz, Zuleyka; Segovia, María Jesús; Fernández, José
2005-01-01
Prediction of insurance company insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques which use financial ratios as explanatory variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of those methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...
Herd, Maria-Teresa; Hall, Timothy J; Jiang, Jingfeng; Zagzebski, James A
2011-12-01
Many quantitative ultrasound (QUS) techniques are based on estimates of the radio-frequency (RF) echo signal power spectrum. Historically, reliable spectral estimates required spatial averaging over large regions-of-interest (ROIs). Spatial compounding techniques have been used to obtain robust spectral estimates for data acquired over small regions of interest. A new technique referred to as "deformation compounding" is another method for providing robust spectral estimates over smaller regions of interest. Motion tracking software is used to follow an ROI while the tissue is deformed (typically by pressing with the transducer). The deformation spatially reorganizes the scatterers so that the resulting echo signal is decorrelated. The RF echo signal power spectrum for the ROI is then averaged over several frames of RF echo data as the tissue is deformed, thus, undergoing deformation compounding. More specifically, averaging spectral estimates among the uncorrelated RF data acquired following small deformations allows reduction in the variance of the power spectral density estimates and, thereby, improves accuracy of spectrum-based tissue property estimation. The viability of deformation compounding has been studied using phantoms with known attenuation and backscatter coefficients. Data from these phantoms demonstrates that a deformation of about 2% frame-to-frame average strain is sufficient to obtain statistically-independent echo signals (with correlations of less than 0.2). Averaging five such frames, where local scatterer reorganization has taken place due to mechanical deformations, reduces the average percent standard deviation among power spectra by 26% and averaging 10 frames reduces the average percent standard deviation by 49%. Deformation compounding is used in this study to improve measurements of backscatter coefficients. These tests show deformation compounding is a promising method to improve the accuracy of spectrum-based quantitative ultrasound
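The variance-reduction mechanism behind deformation compounding can be sketched numerically: averaging periodogram estimates from N statistically independent echo frames reduces the spread of the spectral estimate roughly as 1/sqrt(N). The white-noise echo model below is an assumed stand-in for decorrelated RF frames, not the phantom data of the study.

```python
# Averaging power spectra over uncorrelated frames lowers the variance of the
# spectral estimate -- the statistical core of "deformation compounding".
import numpy as np

rng = np.random.default_rng(0)

def periodogram(frame):
    """Simple power spectral density estimate of one echo frame."""
    return np.abs(np.fft.rfft(frame)) ** 2 / frame.size

def spectrum_std(n_frames, n_trials=200, n_samples=256):
    """Std of the n_frames-averaged periodogram across repeated trials."""
    estimates = []
    for _ in range(n_trials):
        frames = rng.normal(size=(n_frames, n_samples))  # decorrelated echoes
        estimates.append(np.mean([periodogram(f) for f in frames], axis=0))
    return float(np.std(np.asarray(estimates), axis=0).mean())

std_1 = spectrum_std(1)    # single-frame estimate: high variance
std_10 = spectrum_std(10)  # ten compounded frames: roughly std_1 / sqrt(10)
```

This mirrors the reported numbers: averaging 10 decorrelated frames cut the spectral standard deviation by about half in the study, in line with the 1/sqrt(N) trend (which is only reached fully when frame correlations are near zero).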
Directory of Open Access Journals (Sweden)
Lyons MD
2015-07-01
Matthew D Lyons, Culley C Carson III, Robert M Coward, Department of Urology, University of North Carolina, Chapel Hill, NC, USA. Abstract: Placement of an inflatable penile prosthesis (IPP) is the mainstay of surgical treatment for patients with Peyronie's disease (PD) and concomitant medication-refractory erectile dysfunction. Special considerations and adjunctive surgical techniques during the IPP procedure are often required for patients with PD to improve residual penile curvature, as well as postoperative penile length. The surgical outcomes of the various adjunctive techniques are not significantly different from one another, and selection of the appropriate technique must be tailored to patient-specific factors, including the extent of the deformity, the degree of penile shortening, and preoperative patient expectations. The aims of this review were to assess the current literature on published outcomes and surgical techniques involving IPP placement in the treatment of PD. Patient satisfaction and preferences are reported, along with descriptions of, and patient selection for, surgical techniques that include manual penile modeling, management of refractory curvature with concurrent plication, and correction of severe residual curvature and penile shortening with tunica release and plaque incision and grafting. A thorough description of the available techniques and their associated outcomes may help guide surgeons to the most appropriate choice for their patients. Keywords: Peyronie's disease, outcomes, inflatable penile prosthesis, patient expectation, patient satisfaction
The Integration of Voice and Dance Techniques in Musical Theatre: Anatomical Considerations.
Morton, Jennie
2015-06-01
Musical theatre performers are required to be proficient in the three artistic disciplines of dancing, singing, and acting, although in today's modern productions, there is often a requirement to incorporate other skills such as acrobatics and the playing of an instrument. This article focuses on the issues faced by performers when dancing and voicing simultaneously, as it is between these two disciplines where we see the greatest pedagogical divide in terms of breath management and muscle recruitment patterns. The traditional teaching methods of dance and voice techniques are examined, areas of conflict highlighted, and solutions proposed through an exploration of the relevant anatomy.
Brooner, W. G.; Nichols, D. A.
1972-01-01
Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management applications. The scheme uses remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data-handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.
Adams, Jennifer R.; Quartiroli, Alessandro
2010-01-01
The authors note that although the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association, 2000) provides a useful tool for assessment and treatment planning, there has been debate over the lack of attention to issues of diversity. The elements of this debate are presented, along with…
Energy Technology Data Exchange (ETDEWEB)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two
Directory of Open Access Journals (Sweden)
D. Fenta Mekonnen
2018-04-01
Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and coarse spatial resolutions make it difficult to use the outputs of GCMs directly, especially for sustainable water management at regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) of the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation result illustrates that both downscaling techniques (LARS-WG and SDSM) have shown comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done by equally weighted and varying weights of statistical indexes for precipitation only. The evaluation result showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation but LARS-WG performed best in capturing the extreme events and distribution of daily precipitation in the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3 GCMs, were used for downscaling climate scenarios by the LARS-WG model. The result from the ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase
Study of a 5 kW PEMFC using experimental design and statistical analysis techniques
Energy Technology Data Exchange (ETDEWEB)
Wahdame, B.; Francois, X.; Kauffmann, J.M. [Laboratory of Electrical Engineering and Systems (L2ES), Unite mixte de recherche UTBM and UFC - EA 3898, L2ES-UTBM Batiment F, rue Thierry Mieg, 90010 BELFORT Cedex (France); Candusso, D.; Harel, F.; De Bernardinis, A.; Coquery, G. [The French National Institute for Transport and Safety Research (INRETS), 2 avenue du General Malleret-Joinville, 94 114 ARCUEIL Cedex (France)
2007-02-15
Within the framework of the French inter-lab SPACT project (Fuel Cell Systems for Transportation Applications), the behavior of a 5 kW PEM fuel cell stack, fed by humidified hydrogen and compressed air, is investigated in a test platform at Belfort, France. A set of polarization curves is recorded under various conditions of stack temperature, gas pressure, and stoichiometry rate, in order to obtain a map representing the static stack performance. Initially, the tests are defined using experimental design techniques. In order to study the relative impacts of the physical factors on the fuel cell voltage, some polarization curve results are selected from the available static tests by applying experimental design methodology. First, several analyses are used to estimate the impact of the stack temperature, gas pressure, and stoichiometry rate on the fuel cell voltage. Statistical sensitivity analyses (ANOVA) are used to compute, from the available data, the effects and respective contributions of the various physical factors on the stack voltage. The potential for detecting any interactions between the different parameters is shown. Also, some graphic representations are used to display the results of the statistical analyses made for different current values of the polarization curves. Then, the experimental design method and its associated statistical tools are employed in order to identify the influence of the stack temperature and gas pressure on the fuel cell voltage. Moreover, it is shown how the number of experiments needed can be reduced and how certain optimizations of the fuel cell operating parameters, leading to higher performance, can be achieved. The work presented aims to show the suitability of the experimental design method for the characterization, analysis, and improvement of a complex system like a fuel cell generator. The future outlook is proposed in the final part of the paper. The methodologies
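The ANOVA sensitivity step described above can be illustrated with a minimal one-way example: testing whether stack temperature contributes significantly to voltage variation. The voltage readings below are synthetic stand-ins with assumed means and spread, not data from the SPACT tests.

```python
# One-way ANOVA: does stack temperature significantly affect stack voltage?
# (Synthetic data; a full factorial analysis would also cover pressure and
# stoichiometry and their interactions.)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical stack voltage readings (V) at three temperature levels.
v_low = rng.normal(62.0, 0.5, 30)    # e.g. low stack temperature
v_mid = rng.normal(63.0, 0.5, 30)    # e.g. medium stack temperature
v_high = rng.normal(63.8, 0.5, 30)   # e.g. high stack temperature

f_stat, p_value = stats.f_oneway(v_low, v_mid, v_high)
# A small p-value indicates temperature explains a significant share of the
# voltage variance -- the "contribution" quantified by ANOVA in the paper.
```

In the paper's setting the same decomposition is carried out at each current level of the polarization curves, with multi-factor ANOVA used to expose interactions between temperature, pressure, and stoichiometry.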
Considerations for Task Analysis Methods and Rapid E-Learning Development Techniques
Directory of Open Access Journals (Sweden)
Dr. Ismail Ipek
2014-02-01
The purpose of this paper is to provide basic dimensions for rapid training development in e-learning courses in education and business. It starts by defining task analysis, how to select tasks for analysis, and task analysis methods for instructional design. To do this, first, learning and instructional technologies as visions of the future are discussed. Second, the importance of task analysis methods in rapid e-learning is considered, together with learning technologies for asynchronous and synchronous e-learning development. Finally, rapid instructional design concepts and e-learning design strategies are defined and clarified with examples; that is, all steps for effective task analysis and rapid training development techniques based on learning and instructional design approaches are discussed, such as m-learning and other delivery systems. As a result, the concept of task analysis, rapid e-learning development strategies, and the essentials of online course design are discussed, alongside learner interface design features for learners and designers.
Kwon, Dae-Woong; Kim, Kyung-Hyun; Park, Jeong-Yoon; Chin, Dong-Kyu; Kim, Keun-Su; Cho, Young-Eun; Kuh, Sung-Uk
2013-08-01
The posterior lumbar interbody fusion (PLIF) and transforaminal lumbar interbody fusion (TLIF) techniques are commonly used surgical methods for wide indications such as degeneration or trauma. Although they are rarely required for lumbar disk disease in younger patients, there are a few children and adolescents who are indicated for PLIF or TLIF for other reasons, such as congenital severe stenosis with or without lumbar instability that requires wide decompression or severe bony spur that need to be removed. In such cases, different pathophysiology and outcomes are expected compared with adult patients. We retrospectively reviewed data of 23 patients who underwent PLIF or TLIF surgery when less than 20 years old. Clinical and radiographic outcomes were assessed during a mean of 36.4 months follow-up period. The indications of lumbar interbody fusion, success of fusion, complications, and visual analog scale (VAS) were analyzed. Radiographs of all patients taken 6 months after the surgery showed fusion. Clinical outcome was also satisfactory, with improvement of VAS score from 7.7 preoperatively to 2.3 at 6 months after surgery. Only one patient had reoperation due to adjacent segment disease. For adolescent patients with severe bony spur, massive central disk rupture, or severe spondylolisthesis, lumbar interbody fusion surgery has good surgical outcome with few complications.
Statistical techniques for modeling extreme price dynamics in the energy market
International Nuclear Information System (INIS)
Mbugua, L N; Mwita, P N
2013-01-01
Extreme events have a large impact across engineering, science and economics, because they often lead to failures and losses arising from rare, seldom-observed occurrences. In this context, this paper focuses on appropriate statistical methods combining a quantile regression approach with extreme value theory to model the excesses; this plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when little is known about the functional form of the object being estimated. Conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the scarcity of observations at the tail of the distribution. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method used can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modeling.
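The threshold-model step can be sketched as a standard peaks-over-threshold fit: exceedances above a high quantile are modeled with a Generalized Pareto Distribution (GPD), from which far-tail quantiles are extrapolated. The heavy-tailed Student-t returns below are an assumed stand-in for fuel-price data, and the 95% threshold choice is illustrative.

```python
# Peaks-over-threshold: fit a GPD to exceedances above a high threshold and
# extrapolate an extreme quantile (e.g. a 99.9% risk level).
import numpy as np
from scipy import stats

returns = stats.t.rvs(df=3, size=5000, random_state=0)  # heavy-tailed stand-in

threshold = np.quantile(returns, 0.95)        # high-quantile threshold u
excesses = returns[returns > threshold] - threshold

# Fit the GPD to the excesses; location fixed at 0, as is standard for POT.
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)

# Tail quantile: P(X > q) = 0.001, using P(X > u) estimated from the data.
p_exceed = excesses.size / returns.size
q999 = threshold + stats.genpareto.ppf(1.0 - 0.001 / p_exceed,
                                       shape, loc=0, scale=scale)
```

Because the GPD is fitted only to the excesses, the method makes efficient use of the scarce tail observations, which is precisely the scarcity problem the abstract says the threshold model circumvents.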
Maia, A de H; Luiz, A J; Campanhola, C
2000-04-01
Knowledge of population growth potential is crucial for studying population dynamics and for establishing management tactics for pest control. Estimation of population growth can be achieved with fertility life tables because they synthesize data on reproduction and mortality of a population. The five main parameters associated with a fertility life table are as follows: (1) the net reproductive rate (Ro), (2) the intrinsic rate of increase (rm), (3) the mean generation time (T), (4) the doubling time (Dt), and (5) the finite rate of increase (lambda). Jackknife and bootstrap techniques are used to calculate the variance of the rm estimate, which can be extended to the other parameters of life tables. Those methods are computer-intensive, their application requires the development of efficient algorithms, and their implementation is based on a programming language that combines speed and reliability. The objectives of this article are to discuss statistical and computational aspects related to the estimation of life table parameters and to present a SAS program that uses the jackknife to estimate parameters for fertility life tables. The SAS program presented here allows the calculation of confidence intervals for all estimated parameters, as well as providing one-sided and two-sided t-tests to perform pairwise or multiple comparisons between groups, with their respective P values.
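The jackknife idea for rm can be sketched outside SAS as well. The sketch below assumes a simple data layout (each female is a resampling unit, with synthetic per-age offspring counts) and solves the Euler-Lotka equation sum_x exp(-r x) l_x m_x = 1 for rm; it is an illustration of the technique, not the article's program.

```python
# Jackknife standard error of the intrinsic rate of increase r_m from a
# fertility life table, with females as the leave-one-out resampling units.
import numpy as np
from scipy.optimize import brentq

ages = np.arange(1, 8)                       # age classes (e.g. days)
rng = np.random.default_rng(0)
# fecundity[i, x]: synthetic offspring counts of female i at age class x.
fecundity = rng.poisson(3.0, size=(20, ages.size)).astype(float)

def rm(fec):
    """Solve the Euler-Lotka equation for r using the pooled l_x*m_x schedule."""
    lx_mx = fec.mean(axis=0)
    f = lambda r: np.sum(np.exp(-r * ages) * lx_mx) - 1.0
    return brentq(f, -2.0, 5.0)

r_full = rm(fecundity)
n = fecundity.shape[0]
# Leave-one-out estimates -> jackknife pseudo-values -> standard error.
loo = np.array([rm(np.delete(fecundity, i, axis=0)) for i in range(n)])
pseudo = n * r_full - (n - 1) * loo
se_jack = float(pseudo.std(ddof=1) / np.sqrt(n))
```

The pseudo-values also support the t-based confidence intervals and pairwise group comparisons the article describes, since they behave approximately as independent replicates of rm.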
Sharmin, Rumana
This thesis explores the use of multivariate statistical techniques in developing tools for property modeling and monitoring of a high pressure ethylene polymerization process. In the polymer industry, many researchers have shown, mainly in simulation studies, the potential of multivariate statistical methods in the identification and control of polymerization processes. However, very few, if any, of these strategies have been implemented. This work was done using data collected from a commercial high pressure LDPE/EVA reactor located at AT Plastics, Edmonton. The models and methods developed in the course of this research have been validated with real data and, in most cases, implemented in real time. One main objective of this PhD project was to develop and implement a data-based inferential sensor to estimate the melt flow index of LDPE and EVA resins using regularly measured process variables. A steady-state PLS method was used to develop the soft sensor model. A detailed description is given of the data preprocessing steps that should be followed in the analysis of industrial data. Models developed for two of the most frequently produced polymer grades at AT Plastics have been implemented. The models were tested on many data sets and showed acceptable performance when applied with an online bias updating scheme. One observation from many validation exercises was that the model prediction becomes poorer with time as operators use new process conditions in the plant to produce the same resin with the same specification. During the implementation of the soft sensors, we suggested a simple bias update scheme as a remedy to this problem. An alternative and more rigorous approach is to recursively update the model with new data, which is also more suitable for handling grade transitions. Two existing recursive PLS methods, one based on the NIPALS algorithm and the other based on the kernel algorithm, were reviewed. In addition, we proposed a novel RPLS algorithm which is based on the
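The simple online bias-updating idea mentioned above can be sketched as follows. The exponential-weighting form and the parameter alpha are assumptions for illustration, since the abstract does not specify the exact scheme; the melt-flow-index numbers are hypothetical:

```python
def bias_update(predictions, lab_values, alpha=0.3):
    """Exponentially weighted bias correction for an inferential (soft)
    sensor: whenever a lab measurement arrives, the offset between
    prediction and measurement updates the additive bias term."""
    bias = 0.0
    corrected = []
    for pred, lab in zip(predictions, lab_values):
        corrected.append(pred + bias)      # corrected online estimate
        if lab is not None:                # lab analysis available here
            bias = (1 - alpha) * bias + alpha * (lab - pred)
    return corrected

# Model predictions have drifted about 0.5 units below the lab MFI values;
# None marks samples with no lab analysis.
preds = [2.0, 2.1, 2.0, 2.2, 2.1, 2.0]
labs  = [2.5, None, 2.6, None, 2.6, None]
out = bias_update(preds, labs)
print(round(out[-1], 4))
```

The corrected estimates climb toward the lab values as each new measurement arrives, which is why such a scheme compensates for slow model degradation between re-identifications.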
Directory of Open Access Journals (Sweden)
Emrys A Jones
MALDI mass spectrometry can generate profiles that contain hundreds of biomolecular ions directly from tissue. Spatially-correlated analysis, MALDI imaging MS, can simultaneously reveal how each of these biomolecular ions varies in clinical tissue samples. The use of statistical data analysis tools to identify regions containing correlated mass spectrometry profiles is referred to as imaging MS-based molecular histology because of its ability to annotate tissues solely on the basis of the imaging MS data. Several reports have indicated that imaging MS-based molecular histology may be able to complement established histological and histochemical techniques by distinguishing between pathologies with overlapping/identical morphologies and revealing biomolecular intratumor heterogeneity. A data analysis pipeline that identifies regions of imaging MS datasets with correlated mass spectrometry profiles could lead to the development of novel methods for improved diagnosis (differentiating subgroups within distinct histological groups) and annotating the spatio-chemical makeup of tumors. Here it is demonstrated that highlighting the regions within imaging MS datasets whose mass spectrometry profiles were found to be correlated by five independent multivariate methods provides a consistently accurate summary of the spatio-chemical heterogeneity. The corroboration provided by using multiple multivariate methods, efficiently applied in an automated routine, provides assurance that the identified regions are indeed characterized by distinct mass spectrometry profiles, a crucial requirement for its development as a complementary histological tool. When simultaneously applied to imaging MS datasets from multiple patient samples of intermediate-grade myxofibrosarcoma, a heterogeneous soft tissue sarcoma, nodules with mass spectrometry profiles found to be distinct by five different multivariate methods were detected within morphologically identical regions of
International Nuclear Information System (INIS)
Abbas Alkarkhi, F.M.; Ismail, Norli; Easa, Azhar Mat
2008-01-01
Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg) using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb, and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied to analyze the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
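A two-feature linear discriminant of the kind DA computes with Zn and Cd can be sketched in pure Python as a Fisher discriminant. The (Zn, Cd) pairs below are hypothetical illustrations, not the study's measurements:

```python
def fisher_discriminant(a, b):
    """Two-class Fisher linear discriminant on two features,
    e.g. (Zn, Cd) concentrations from two rivers."""
    def mean(rows):
        n = len(rows)
        return [sum(r[0] for r in rows) / n, sum(r[1] for r in rows) / n]
    ma, mb = mean(a), mean(b)
    # Pooled within-class scatter matrix (2x2).
    s = [[0.0, 0.0], [0.0, 0.0]]
    for rows, m in ((a, ma), (b, mb)):
        for r in rows:
            d = [r[0] - m[0], r[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],   # discriminant weights
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    cut = sum(wi * (x + y) / 2 for wi, x, y in zip(w, ma, mb))
    return lambda r: r[0] * w[0] + r[1] * w[1] > cut  # True -> class a

# Hypothetical (Zn, Cd) pairs in mg/kg for two sampling locations.
juru = [(95, 1.8), (102, 2.1), (98, 1.9), (105, 2.2), (99, 2.0)]
jejawi = [(70, 1.1), (74, 1.3), (68, 1.0), (76, 1.2), (72, 1.1)]
classify = fisher_discriminant(juru, jejawi)
correct = sum(classify(r) for r in juru) + sum(not classify(r) for r in jejawi)
print(correct, "of 10 correctly assigned")
```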
Energy Technology Data Exchange (ETDEWEB)
Carneiro, Alvaro Luiz Guimaraes [Instituto de Pesquisas Energeticas e Nucleares (IPEN-CNEN/SP), Sao Paulo, SP (Brazil)], E-mail: carneiro@ipen.br; Santos, Francisco Carlos Barbosa dos [Fundacao Instituto de Pesquisas Economicas (FIPE/USP), Sao Paulo, SP (Brazil)], E-mail: fcarlos@fipe.org.br
2007-07-01
Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at all levels, local, regional and global: combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of the IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)
International Nuclear Information System (INIS)
Carneiro, Alvaro Luiz Guimaraes; Santos, Francisco Carlos Barbosa dos
2007-01-01
Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at all levels, local, regional and global: combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of the IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)
Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.
2017-06-01
Multivariate statistical techniques and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using the Varimax with Kaiser Normalization method of extraction, for both the rotated space and the component matrix, was applied to the data. Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin, together accounting for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural inputs, as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relations among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal
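The "share of total variance" reported for the principal components above can be illustrated with a small pure-Python sketch: standardize the variables, form the correlation matrix, and extract the leading eigenvalue by power iteration. The synthetic three-variable data set (two strongly related variables plus one independent one) is hypothetical:

```python
import random

def correlation_matrix(cols):
    """Correlation matrix of column-oriented data (one list per variable)."""
    def standardize(col):
        n = len(col)
        m = sum(col) / n
        s = (sum((x - m) ** 2 for x in col) / (n - 1)) ** 0.5
        return [(x - m) / s for x in col]
    z = [standardize(c) for c in cols]
    n, p = len(cols[0]), len(cols)
    return [[sum(z[i][k] * z[j][k] for k in range(n)) / (n - 1)
             for j in range(p)] for i in range(p)]

def leading_eigenvalue(mat, iters=200):
    """Power iteration for the dominant eigenvalue of a symmetric matrix."""
    p = len(mat)
    v = [1.0] * p
    lam = 1.0
    for _ in range(iters):
        w = [sum(mat[i][j] * v[j] for j in range(p)) for i in range(p)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

random.seed(1)
x = [random.gauss(0, 1) for _ in range(500)]
y = [xi + random.gauss(0, 0.2) for xi in x]   # strongly tied to x
z = [random.gauss(0, 1) for _ in range(500)]  # independent variable
r = correlation_matrix([x, y, z])
pc1_share = leading_eigenvalue(r) / 3.0       # PC1's share of total variance
print(round(pc1_share, 2))
```

For a correlation matrix the total variance equals the number of variables, so each eigenvalue divided by that count is the corresponding component's share; deflation would yield the remaining components the same way.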
Abbas Alkarkhi, F M; Ismail, Norli; Easa, Azhar Mat
2008-02-11
Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg) using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb, and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied to analyze the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
Energy Technology Data Exchange (ETDEWEB)
Abbas Alkarkhi, F.M. [School of Industrial Technology, Environmental Technology Division, Universiti Sains Malaysia, 11800 Penang (Malaysia)], E-mail: abbas@usm.my; Ismail, Norli [School of Industrial Technology, Environmental Technology Division, Universiti Sains Malaysia, 11800 Penang (Malaysia)], E-mail: norlii@usm.my; Easa, Azhar Mat [School of Industrial Technology, Food Technology Division, Universiti Sains Malaysia, 11800 Penang (Malaysia)], E-mail: azhar@usm.my
2008-02-11
Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg) using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb, and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied to analyze the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
Directory of Open Access Journals (Sweden)
M.A. Delavar
2016-02-01
Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food safety issues, potential health risks, and the detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them to human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide some essential information for better management of polluted soils to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of probable sources that can control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, the Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sample locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via Atomic Absorption Spectroscopy (AAS). Data were statistically analyzed using the SPSS software version 17.0 for Windows. Correlation Matrix (CM), Principal Component Analysis (PCA) and Factor Analysis (FA) techniques were performed in order to identify the probable sources of HMs in the studied soils. Results and
Sampson, Joshua N; Hildesheim, Allan; Herrero, Rolando; Gonzalez, Paula; Kreimer, Aimee R; Gail, Mitchell H
2018-05-01
Cervical cancer is a leading cause of cancer mortality in women worldwide. Human papillomavirus (HPV) types 16 and 18 cause about 70% of all cervical cancers. Clinical trials have demonstrated that three doses of either commercially available HPV vaccine, Cervarix® or Gardasil®, prevent most new HPV 16/18 infections and associated precancerous lesions. Based on evidence of immunological non-inferiority, 2-dose regimens have been licensed for adolescents in the United States, European Union, and elsewhere. However, if a single dose were effective, vaccine costs would be reduced substantially and the logistics of vaccination would be greatly simplified, enabling vaccination programs in developing countries. The National Cancer Institute (NCI) and the Agencia Costarricense de Investigaciones Biomédicas (ACIB) are conducting, with support from the Bill & Melinda Gates Foundation and the International Agency for Research on Cancer (IARC), a large 24,000-girl study to evaluate the efficacy of a 1-dose regimen. The first component of the study is a four-year non-inferiority trial comparing 1- to 2-dose regimens of the two licensed vaccines. The second component is an observational study that estimates the vaccine efficacy (VE) of each regimen by comparing the HPV infection rates in the trial arms to those in a contemporaneous survey group of unvaccinated girls. In this paper, we describe the design and statistical analysis for this study. We explain the advantage of defining non-inferiority on the absolute risk scale when the expected event rate is near 0 and, given this definition, suggest an approach to account for missing clinic visits. We then describe the problem of estimating VE in the absence of a randomized placebo arm and offer our solution. Copyright © 2018. Published by Elsevier Inc.
Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir
2017-11-01
In this study, multivariate statistical techniques in combination with GIS are used to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season and 15 water quality parameters including TSS, TDS, pH, DO, BOD, Cl-, F-, NO3-, NO2-, SO42-, Ca, Mg, K, Zn and Pb were measured. The univariate overview of the water quality parameters is: TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl- 51.494 ± 28.095 mg/l, F- 0.771 ± 0.153 mg/l, NO3- 2.211 ± 0.878 mg/l, NO2- 4.692 ± 5.971 mg/l, SO42- 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick field dust settles (TDS, TSS) in the nearby water body. PC2 explains 22.123% of the total variance and indicates agricultural influence (K, Ca, and NO2-). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO42-, Cl-, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yields three major clusters for both the water parameters and the sampling sites. The site clusters showed a grouping pattern similar to that of the R-mode factor score map. The present work reveals a new scope for monitoring roadside water quality for future research in Bangladesh.
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
Herd, Maria-Teresa; Hall, Timothy J; Jiang, Jingfeng; Zagzebski, James A
2011-01-01
Many quantitative ultrasound (QUS) techniques are based on estimates of the radio frequency (RF) echo signal power spectrum. Historically, reliable spectral estimates required spatial averaging over large regions of interest (ROIs). Spatial compounding techniques have been used to obtain robust spectral estimates for data acquired over small regions of interest. A new technique referred to as “deformation compounding” is another method for providing robust spectral estimates over smaller regio...
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in U.S. college and university undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
Statistical Techniques for Analyzing Process or "Similarity" Data in TID Hardness Assurance
Ladbury, R.
2010-01-01
We investigate techniques for estimating the contributions to TID hardness variability for families of linear bipolar technologies, determining how part-to-part and lot-to-lot variability change for different part types in the process.
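Separating part-to-part from lot-to-lot contributions to hardness variability is classically done with a one-way random-effects ANOVA. A minimal sketch, with simulated balanced lots whose effect sizes (lot sd 3.0, part sd 1.0) are assumptions for illustration:

```python
import random

def variance_components(lots):
    """One-way random-effects ANOVA for a balanced design: split total
    variability into lot-to-lot and part-to-part (within-lot) components."""
    k = len(lots)     # number of lots
    n = len(lots[0])  # parts per lot
    grand = sum(sum(lot) for lot in lots) / (k * n)
    means = [sum(lot) / n for lot in lots]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)   # between lots
    msw = sum(sum((x - m) ** 2 for x in lot)
              for lot, m in zip(lots, means)) / (k * (n - 1))  # within lots
    return max((msb - msw) / n, 0.0), msw  # (lot-to-lot, part-to-part)

random.seed(2)
# Simulated TID failure levels: lot-effect sd 3.0, part-level sd 1.0.
lots = []
for _ in range(30):
    lot_effect = random.gauss(0.0, 3.0)
    lots.append([lot_effect + random.gauss(0.0, 1.0) for _ in range(8)])
var_lot, var_part = variance_components(lots)
print(round(var_lot, 1), round(var_part, 1))
```

The method-of-moments split above recovers variances close to the simulated 9.0 (lot-to-lot) and 1.0 (part-to-part); the negative-estimate truncation to zero is the usual convention when MSW exceeds MSB.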
The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading
Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.
2016-01-01
Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
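The core idea of Random Forests (bootstrap resampling plus random feature selection, aggregated by majority vote) can be sketched with decision stumps standing in for full trees. The collinear-predictor data set mimics the situation described above and is entirely synthetic:

```python
import random

def stump_fit(X, y, feat_ids):
    """Best single-feature threshold split, chosen by misclassification count."""
    best = None
    for f in feat_ids:
        vals = sorted(set(row[f] for row in X))
        for t in vals[::max(1, len(vals) // 12)]:  # a dozen candidate cuts
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            lmaj = max(set(left), key=left.count)
            rmaj = max(set(right), key=right.count)
            err = sum(v != lmaj for v in left) + sum(v != rmaj for v in right)
            if best is None or err < best[0]:
                best = (err, f, t, lmaj, rmaj)
    _, f, t, lmaj, rmaj = best
    return lambda row: lmaj if row[f] <= t else rmaj

def forest_fit(X, y, n_trees=25):
    """Bagged stumps: each sees a bootstrap resample and one random feature."""
    n, p = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        idx = [random.randrange(n) for _ in range(n)]          # bootstrap
        feats = [random.randrange(p)]                          # random feature
        trees.append(stump_fit([X[i] for i in idx],
                               [y[i] for i in idx], feats))
    def vote(row):
        preds = [t(row) for t in trees]
        return max(set(preds), key=preds.count)                # majority vote
    return vote

random.seed(3)
# Two highly collinear predictors plus a pure-noise predictor.
X, y = [], []
for _ in range(120):
    a = random.gauss(0, 1)
    X.append([a, a + random.gauss(0, 0.3), random.gauss(0, 1)])
    y.append(1 if a > 0 else 0)
predict = forest_fit(X, y)
acc = sum(predict(row) == yi for row, yi in zip(X, y)) / len(X)
print(acc)
```

The ensemble stays accurate even though two predictors carry nearly the same information, which is the property that makes the method attractive for the collinear reading-ability data described above; real Random Forests use full trees and out-of-bag error rather than the training accuracy shown here.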
National Research Council Canada - National Science Library
Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E
2008-01-01
... for a particular dependent variable. These capabilities are combined into a unique visualization system that is demonstrated via a North Atlantic hurricane climate study using a systematic workflow. This research corroborates the notion that enhanced parallel coordinates coupled with statistical analysis can be used for more effective knowledge discovery and confirmation in complex, real-world data sets.
Jeong, Min Sook; Yu, Kyeong-Nam; Chung, Hyun Hoon; Park, Soo Jin; Lee, Ah Young; Song, Mi Ryoung; Cho, Myung-Haing; Kim, Jun Sung
2016-05-19
Qualitative and quantitative analyses of reactive oxygen species (ROS) generated on the surfaces of nanomaterials are important for understanding their toxicity and toxic mechanisms, which is in turn beneficial for manufacturing more biocompatible nanomaterials in many industrial fields. Electron spin resonance (ESR) is a useful tool for detecting ROS formation. However, using this technique without first considering the physicochemical properties of the nanomaterials and the proper conditions for the spin trapping agent (such as incubation time) may lead to misinterpretation of the resulting data. In this report, we suggest methodological considerations for ESR as they pertain to magnetism, sample preparation and the proper incubation time with spin trapping agents. Based on our results, each spin trapping agent should be given its proper incubation time. For nanomaterials having magnetic properties, it is useful to remove the nanomaterials via centrifugation after they have reacted with the spin trapping agents. Sonication, used for sample dispersion, and exposure of the sample to light should be controlled during ESR in order to enhance the obtained ROS signal. This report will allow researchers to better design ESR spin trapping applications involving nanomaterials.
On some surprising statistical properties of a DNA fingerprinting technique called AFLP
Gort, G.
2010-01-01
AFLP is a widely used DNA fingerprinting technique, resulting in band absence - presence profiles, like a bar code. Bands represent DNA fragments, sampled from the genome of an individual plant or other organism. The DNA fragments travel through a lane of an electrophoretic gel or microcapillary
International Nuclear Information System (INIS)
Carew, John F.; Finch, Stephen J.; Lois, Lambros
2003-01-01
The calculated >1-MeV pressure vessel fluence is used to determine the fracture toughness and integrity of the reactor pressure vessel. It is therefore of the utmost importance to ensure that the fluence prediction is accurate and unbiased. In practice, this assurance is provided by comparing the predictions of the calculational methodology with an extensive set of accurate benchmarks. A benchmarking database is used to provide an estimate of the overall average measurement-to-calculation (M/C) bias in the calculations. This average is used as an ad hoc multiplicative adjustment to the calculations to correct for the observed calculational bias. However, this average only provides a well-defined and valid adjustment of the fluence if the M/C data are homogeneous; i.e., the data are statistically independent and there is no correlation between subsets of the M/C data. Typically, the identification of correlations between the errors in the database M/C values is difficult because the correlation is of the same magnitude as the random errors in the M/C data and varies substantially over the database. In this paper, an evaluation of a reactor dosimetry benchmark database is performed to determine the statistical validity of the adjustment to the calculated pressure vessel fluence. Physical mechanisms that could potentially introduce a correlation between the subsets of M/C ratios are identified and included in a multiple regression analysis of the M/C data. Rigorous statistical criteria are used to evaluate the homogeneity of the M/C data and determine the validity of the adjustment. For the database evaluated, the M/C data are found to be strongly correlated with dosimeter response threshold energy and dosimeter location (e.g., cavity versus in-vessel). It is shown that, because of the inhomogeneity in the M/C data for this database, the benchmark data do not provide a valid basis for adjusting the pressure vessel fluence. The statistical criteria and methods employed in
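The ad hoc multiplicative adjustment described above amounts to scaling the calculated fluence by the average M/C ratio. A minimal sketch with hypothetical benchmark values; as the paper stresses, this adjustment is only valid when the M/C data are homogeneous:

```python
def mc_bias_adjust(measured, calculated):
    """Average measurement-to-calculation (M/C) ratio, used as an ad hoc
    multiplicative bias adjustment (valid only for homogeneous M/C data)."""
    ratios = [m / c for m, c in zip(measured, calculated)]
    return sum(ratios) / len(ratios)

# Hypothetical benchmark dosimeter responses (measured vs. calculated).
measured = [1.02e11, 9.7e10, 1.05e11, 9.9e10]
calculated = [1.00e11, 1.00e11, 1.00e11, 1.00e11]
bias = mc_bias_adjust(measured, calculated)
adjusted_fluence = 5.0e19 * bias  # calculated fluence scaled by average M/C
print(round(bias, 4))
```

When subsets of the ratios correlate with, say, dosimeter threshold energy, this single average conflates systematically different biases, which is precisely the failure mode the paper documents.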
Source of statistical noises in the Monte Carlo sampling techniques for coherently scattered photons
Muhammad, Wazir; Lee, Sang Hoon
2012-01-01
Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs) and their advantages and shortcomings in calculating elastic scattering cross sections can be found in the literature. However, the issues related to their implementation in the Monte Carlo (MC) sampling for coherently scattered photons is still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method to draw the integrated values of squared RFFs/MFF...
Application of the Statistical ICA Technique in the DANCE Data Analysis
Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration
2015-10-01
The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the summed energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify the contributions to the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate the (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present some results of the application of ICA algorithms and their modification for the DANCE experimental data analysis. This research is supported by the U. S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
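For two mixed signals, the essence of ICA (whiten the data, then rotate to maximize non-Gaussianity, measured here by excess kurtosis) can be sketched in pure Python. The uniform and Laplacian sources and the mixing matrix are illustrative stand-ins, not DANCE spectra:

```python
import math, random

def whiten(x1, x2):
    """Zero-mean, unit-covariance transform of two mixed signals,
    via a closed-form eigendecomposition of the 2x2 covariance."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    x1 = [v - m1 for v in x1]
    x2 = [v - m2 for v in x2]
    a = sum(v * v for v in x1) / n
    b = sum(u * v for u, v in zip(x1, x2)) / n
    c = sum(v * v for v in x2) / n
    d = math.sqrt(((a - c) / 2) ** 2 + b * b)
    l1, l2 = (a + c) / 2 + d, (a + c) / 2 - d     # eigenvalues
    t = 0.5 * math.atan2(2 * b, a - c)            # principal-axis angle
    ct, st = math.cos(t), math.sin(t)
    z1 = [(ct * u + st * v) / math.sqrt(l1) for u, v in zip(x1, x2)]
    z2 = [(-st * u + ct * v) / math.sqrt(l2) for u, v in zip(x1, x2)]
    return z1, z2

def kurt(z):
    """Excess kurtosis, assuming the input already has unit variance."""
    return sum(v ** 4 for v in z) / len(z) - 3.0

def best_rotation(z1, z2):
    """Rotate whitened data to maximize total non-Gaussianity."""
    best, pair = -1.0, None
    for deg in range(90):
        t = math.pi * deg / 180.0
        c, s = math.cos(t), math.sin(t)
        y1 = [c * u + s * v for u, v in zip(z1, z2)]
        y2 = [-s * u + c * v for u, v in zip(z1, z2)]
        score = abs(kurt(y1)) + abs(kurt(y2))
        if score > best:
            best, pair = score, (y1, y2)
    return pair

random.seed(4)
n = 4000
src_uni = [random.uniform(-1, 1) for _ in range(n)]   # sub-gaussian source
src_lap = [random.expovariate(1) * random.choice([-1, 1])
           for _ in range(n)]                         # super-gaussian source
x1 = [0.6 * u + 0.4 * v for u, v in zip(src_uni, src_lap)]
x2 = [0.5 * u - 0.7 * v for u, v in zip(src_uni, src_lap)]
y1, y2 = best_rotation(*whiten(x1, x2))
ks = sorted([kurt(y1), kurt(y2)])
print(round(ks[0], 1), round(ks[1], 1))
```

After separation, one recovered component is sub-gaussian (uniform, excess kurtosis near -1.2) and the other super-gaussian (Laplacian, near +3), up to sign and permutation; production algorithms such as FastICA use a fixed-point iteration instead of this brute-force rotation search.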
Bilgin, Ayla
2015-11-01
The purpose of this study was to assess the impact of 24 water parameters, measured semi-annually between 2011 and 2013 in the Coruh Basin (Turkey), on water quality. The study utilised analysis of variance (ANOVA), principal component analysis (PCA) and factor analysis (FA) methods. The water-quality data were obtained from a total of four sites by the 26th Regional Directorate of the State Hydraulic Works (DSI). ANOVA was carried out to identify differences between the parameters at the different measuring sites. The variables were classified using factor analysis, and the ANOVA test established a statistically significant difference in water quality between the downstream and upstream waste waters released by the Black Sea copper companies, while no statistically significant difference was observed between the Murgul and Borcka Dams. It was determined through factor analysis that five factors explained 81.3% of the total variance. It was concluded that domestic, industrial and agricultural activities, in combination with physicochemical properties, were factors affecting the quality of the water in the Coruh Basin.
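The ANOVA step can be sketched in a few lines. Below is a minimal one-way F statistic on invented upstream/downstream measurements (not the DSI data); a large F relative to the F distribution's critical value indicates a real between-site difference.

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

upstream = [7.1, 7.3, 7.0, 7.2]     # hypothetical site measurements
downstream = [6.1, 6.0, 6.3, 6.2]
f = one_way_anova_f([upstream, downstream])
print(f > 10)  # a very large F suggests a significant site difference
```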
Statistical techniques for automating the detection of anomalous performance in rotating machinery
International Nuclear Information System (INIS)
Piety, K.R.; Magette, T.E.
1978-01-01
Surveillance techniques that extend the sophistication of existing automated systems for monitoring industrial rotating equipment are described. The monitoring system automatically established limiting criteria during an initial learning period of a few days; subsequently, while monitoring the test rotor during an extended period of normal operation, it experienced a false alarm rate of 0.5%. At the same time, the monitoring system successfully detected all fault types that were introduced into the test setup. Tests on real equipment are needed to provide final verification of the monitoring techniques. There are areas that would profit from additional investigation in the laboratory environment. A comparison of the relative value of alternate descriptors under given fault conditions would be worthwhile. This should be pursued in conjunction with extending the set of fault types available, e.g., bearing problems. Other tests should examine the effects of using fewer (coarser) intervals to define the lumped operational states. Finally, techniques to diagnose the most probable fault should be developed by drawing upon the extensive data automatically logged by the monitoring system.
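The learn-then-monitor scheme can be sketched as follows; the baseline signal, limit width (mean ± 3σ) and test values are invented for illustration, not taken from the report.

```python
import random
from statistics import mean, stdev

random.seed(7)

# "Learning period": a vibration descriptor sampled under normal operation.
baseline = [random.gauss(1.0, 0.05) for _ in range(500)]
m, s = mean(baseline), stdev(baseline)
lo, hi = m - 3 * s, m + 3 * s   # automatically established limiting criteria

def is_anomalous(x):
    """Flag a new descriptor value that falls outside the learned limits."""
    return x < lo or x > hi

# Subsequent monitoring: a developing fault shifts the descriptor.
print(is_anomalous(1.02), is_anomalous(1.45))  # False True
```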
Lightfoot, Emma; O’Connell, Tamsin C.
2016-01-01
Oxygen isotope analysis of archaeological skeletal remains is an increasingly popular tool to study past human migrations. It is based on the assumption that human body chemistry preserves the δ18O of precipitation in such a way as to be a useful technique for identifying migrants and, potentially, their homelands. In this study, the first such global survey, we draw on published human tooth enamel and bone bioapatite data to explore the validity of using oxygen isotope analyses to identify migrants in the archaeological record. We use human δ18O results to show that there are large variations in human oxygen isotope values within a population sample. This may relate to physiological factors influencing the preservation of the primary isotope signal, or to human activities (such as brewing, boiling, stewing, differential access to water sources and so on) causing variation in ingested water and food isotope values. We compare the number of outliers identified using various statistical methods. We determine that the most appropriate method for identifying migrants is dependent on the data but is likely to be the IQR or median absolute deviation from the median under most archaeological circumstances. Finally, through a spatial assessment of the dataset, we show that the degree of overlap in human isotope values from different locations across Europe is such that identifying individuals' homelands on the basis of oxygen isotope analysis alone is not possible for the regions analysed to date. Oxygen isotope analysis is a valid method for identifying first-generation migrants from an archaeological site when used appropriately; however, it is difficult to identify migrants using statistical methods for a sample size of less than c. 25 individuals. In the absence of local previous analyses, each sample should be treated as an individual dataset and statistical techniques can be used to identify migrants, but in most cases pinpointing a specific homeland should
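The two outlier rules the abstract recommends (IQR fence and median absolute deviation) are easy to state in code. The δ18O values below are invented for illustration, with one deliberately low "migrant" value; the quartile estimate is a deliberately simple one.

```python
from statistics import median

def outliers_iqr(values, k=1.5):
    """Tukey-style fence: flag values beyond k*IQR from the quartiles."""
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]   # crude quartile positions
    iqr = q3 - q1
    return [v for v in values if v < q1 - k * iqr or v > q3 + k * iqr]

def outliers_mad(values, k=3.0):
    """Flag values more than k median-absolute-deviations from the median."""
    med = median(values)
    mad = median([abs(v - med) for v in values])
    return [v for v in values if abs(v - med) > k * mad]

# Hypothetical enamel d18O values; one individual is a migrant.
d18o = [26.1, 26.4, 25.9, 26.0, 26.3, 26.2, 25.8, 26.5, 23.1]
print(outliers_iqr(d18o), outliers_mad(d18o))  # [23.1] [23.1]
```

Both rules are robust to the outlier itself, which is why they outperform mean ± 2σ screening on small archaeological samples.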
Siderius, Daniel W; Mahynski, Nathan A; Shen, Vincent K
2017-05-01
Measurement of the pore-size distribution (PSD) via gas adsorption and the so-called "kernel method" is a widely used characterization technique for rigid adsorbents. Yet, standard techniques and analytical equipment are not appropriate to characterize the emerging class of flexible adsorbents that deform in response to the stress imparted by an adsorbate gas, as the PSD is a characteristic of the material that varies with the gas pressure and any other external stresses. Here, we derive the PSD for a flexible adsorbent using statistical mechanics in the osmotic ensemble to draw analogy to the kernel method for rigid materials. The resultant PSD is a function of the ensemble constraints including all imposed stresses and, most importantly, the deformation free energy of the adsorbent material. Consequently, a pressure-dependent PSD is a descriptor of the deformation characteristics of an adsorbent and may be the basis of future material characterization techniques. We discuss how, given a technique for resolving pressure-dependent PSDs, the present statistical mechanical theory could enable a new generation of analytical tools that measure and characterize certain intrinsic material properties of flexible adsorbents via otherwise simple adsorption experiments.
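For orientation, the rigid-adsorbent kernel method that this work generalizes is usually written as an adsorption integral equation; the notation below is generic textbook shorthand, not the authors' own.

```latex
% Measured isotherm as a superposition of single-pore (kernel) isotherms
% \rho(P,H), weighted by the pore-size distribution f(H):
N_{\mathrm{exp}}(P) = \int_{H_{\min}}^{H_{\max}} f(H)\, \rho(P, H)\, \mathrm{d}H
```

The abstract's central point is that for a flexible adsorbent the distribution f is no longer a fixed material property but varies with the gas pressure and any other imposed stresses, f = f(H; P, ...), with the deformation free energy entering through the osmotic-ensemble partition function.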
Statistical techniques for automating the detection of anomalous performance in rotating machinery
International Nuclear Information System (INIS)
Piety, K.R.; Magette, T.E.
1979-01-01
The level of technology utilized in automated systems that monitor industrial rotating equipment and the potential of alternative surveillance methods are assessed. It is concluded that changes in surveillance methodology would upgrade ongoing programs and yet still be practical for implementation. An improved anomaly recognition methodology is formulated and implemented on a minicomputer system. The effectiveness of the monitoring system was evaluated in laboratory tests on a small rotor assembly, using vibrational signals from both displacement probes and accelerometers. Time and frequency domain descriptors are selected to compose an overall signature that characterizes the monitored equipment. Limits for normal operation of the rotor assembly are established automatically during an initial learning period. Thereafter, anomaly detection is accomplished by applying an approximate statistical test to each signature descriptor. As demonstrated over months of testing, this monitoring system is capable of detecting anomalous conditions while exhibiting a false alarm rate below 0.5%.
Benson, Nsikak U; Asuquo, Francis E; Williams, Akan B; Essien, Joseph P; Ekong, Cyril I; Akpabio, Otobong; Olajire, Abaas A
2016-01-01
Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk assessment by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in these ecosystems was influenced by multiple pollution sources.
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analyses include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable across independent Latin hypercube samples.
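Item (ii), detecting monotonic relationships with rank correlations, can be sketched as follows: Spearman's coefficient is simply Pearson's correlation computed on ranks, so it scores a monotonic but non-linear input/output relation as perfect. The data here are invented, and the rank routine skips tie handling for brevity.

```python
from statistics import mean

def ranks(values):
    """Rank positions (1-based); ties are not handled in this sketch."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r

def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

# A monotonic but non-linear input/output relation from a model run:
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [v ** 3 for v in x]
print(round(pearson(x, y), 2), round(spearman(x, y), 2))  # 0.93 1.0
```

The gap between the two coefficients (0.93 vs. 1.0) is exactly why the procedure escalates from linear to rank-based tests.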
Directory of Open Access Journals (Sweden)
Qing Gu
2016-03-01
Qiandao Lake (Xin’an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April–July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
An Efficient Statistical Computation Technique for Health Care Big Data using R
Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.
2017-08-01
Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosis of a problem at an earlier stage increases the chances of survival and fast recovery, reducing both the time of recovery and the cost of treatment. One such medical issue is cancer, and breast cancer has been identified as the second leading cause of cancer death. If detected at an early stage it can be cured. Once a patient is found to have a breast tumor, it should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. Day-to-day, new records are added, which leads to growth in the data to be classified, and this tends toward a big data problem. The algorithm is implemented in R, which is the most popular platform for applying machine learning algorithms to statistical computing. Experimentation is conducted using various classification evaluation metrics and various values of k. The results show that the KNN algorithm performs better than existing models.
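The paper's classifier is implemented in R; as a language-neutral sketch of the same instance-based idea, here is a minimal KNN majority vote in Python. The (radius, texture) feature values and labels are invented, not taken from any breast cancer dataset.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical (mean radius, mean texture) features for tumour records:
train = [((14.2, 20.1), "benign"), ((13.8, 19.5), "benign"),
         ((13.5, 21.0), "benign"), ((22.4, 27.3), "malignant"),
         ((21.9, 28.8), "malignant"), ((23.1, 26.5), "malignant")]
print(knn_predict(train, (14.0, 20.0)), knn_predict(train, (22.5, 27.0)))
```

Because KNN defers all work to query time, the cost of each prediction grows with the training set, which is exactly why the paper frames the growing claim archive as a big data problem.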
International Nuclear Information System (INIS)
Horwitz, G.; Katz, J.
1977-01-01
A microcanonical statistical mechanics formulation has been developed for nonrotating systems of stars of equal mass. The system, placed in a confining volume and with a cutoff of the interparticle gravitational interaction at short distances, can have thermodynamic equilibrium states. Sequences of equilibrium states are presumed to simulate slowly evolving, near-equilibrium configurations of real star clusters. An exact functional expression for the entropy of such systems is derived which also has a relativistic counterpart. The entropy is evaluated in an approximation which is mean field plus fluctuations. Evaluations beyond this approximation can readily be carried out. We obtain the necessary and sufficient conditions for spherically symmetric clusters to be thermodynamically stable about a mean field solution, with respect to arbitrary fluctuations in the microcanonical ensemble. The stability conditions amount to the following quantities having definite signs: (i) a functional form, quadratic in "mean field" fluctuations, (ii) the derivative of the gravito-chemical potential with respect to the number of particles, at fixed temperature, being positive definite, and (iii) the heat capacity C_ν, at fixed number of particles, being positive definite. In a sequence of equilibrium configurations in which the ratio of densities between the center and the boundary of the cluster is progressively increased, conditions (i) and (ii) break down simultaneously when this density contrast is equal to 1.58. Condition (i) remains unsatisfied for higher density contrasts. The limit 1.58 on the density contrast is much more stringent than that given by condition (iii), which breaks down only for a value of 32.1. Our results are in sharp contrast to those of Antonov's criterion, according to which instabilities appear when the density contrast is higher than 709. Time scales of evolution of the various unstable configurations are not considered in this work.
International Nuclear Information System (INIS)
Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.
1982-11-01
One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
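The k-means step at the heart of these composite methods is short to state. The sketch below clusters invented (K, U, Th)-style count-rate triples into two groups; it uses plain Lloyd iterations, whereas the report's "convergent" variant adds refinements for large data sets.

```python
import random
from statistics import mean

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign each point to the nearest centre, recompute."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # Recompute each centre as its cluster mean (keep old if empty).
        centers = [tuple(mean(p[d] for p in cl) for d in range(len(points[0])))
                   if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated groups of hypothetical radioelement count triples:
background = [(1.0 + 0.1 * i, 2.0, 0.5) for i in range(5)]
anomalous = [(9.0 + 0.1 * i, 8.0, 4.5) for i in range(5)]
centers, clusters = kmeans(background + anomalous, k=2)
sizes = sorted(len(c) for c in clusters)
print(sizes)  # [5, 5]
```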
Malik, Riffat Naseem; Hashmi, Muhammad Zaffar
2017-10-01
The Himalayan foothill streams of Pakistan play an important role in drinking water supply and irrigation of farmlands; thus, their water quality is closely related to public health. Multivariate techniques were applied to check spatial and seasonal trends and the sources of metal contamination in these streams. Grab surface water samples were collected from different sites (5-15 cm water depth) in pre-washed polyethylene containers. A Fast Sequential Atomic Absorption Spectrophotometer (Varian FSAA-240) was used to measure the metal concentrations. Concentrations of Ni, Cu, and Mn were higher in the pre-monsoon season than in the post-monsoon season. Cluster analysis identified impaired, moderately impaired and least impaired clusters based on water parameters. Discriminant function analysis indicated that spatial variability in the water was due to temperature, electrical conductivity, nitrates, iron and lead, whereas seasonal variations were correlated with 16 physicochemical parameters. Factor analysis identified municipal and poultry waste, automobile activities, surface runoff, and soil weathering as major sources of contamination. Levels of Mn, Cr, Fe, Pb, Cd, Zn and alkalinity were above the WHO and USEPA standards for surface water. The results of the present study will help the responsible authorities manage the Himalayan foothill streams.
Teleni, Vicki; Baldauf, Richard B., Jr.
A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…
Kakar, R. K.; Pandey, P. C.
1983-01-01
A linear statistical technique using a 'leaps and bounds' procedure (Furnival and Wilson, 1974) is developed for retrieving geophysical parameters from remote measurements. It is used for retrieving sea surface temperatures from the Scanning Multichannel Microwave Radiometer (SMMR) on Seasat. The technique uses an efficient algorithm to select the best fixed-size subset of the 10 SMMR channels for linearly retrieving a given geophysical parameter. The 5-channel subset (6.6V, 6.6H, 10H, 18V, 21H), where V and H refer, respectively, to the vertical and horizontal polarizations and the numbers are the channel frequencies in gigahertz, gives the minimum rms error in estimating the sea surface temperature. A comparison with ground truth indicates that the algorithm infers the temperature with an rms accuracy of better than 1.5 K under most environmental conditions. A quality control procedure, which is seen as holding promise for further improving the accuracy, is proposed.
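Best-subset selection for a linear retrieval can be sketched by brute force (the 'leaps and bounds' procedure prunes this search without changing the answer). The five synthetic "channels" below are invented; the target is built from channels 0 and 2 only, so the exhaustive search should recover that pair.

```python
import random
from itertools import combinations

def solve(A, b):
    """Gaussian elimination with partial pivoting for small linear systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def rms_for_subset(channels, y, subset):
    """Least-squares fit (intercept + chosen channels) via normal equations."""
    m = len(y)
    X = [[1.0] + [channels[j][i] for j in subset] for i in range(m)]
    p = len(subset) + 1
    A = [[sum(X[i][a] * X[i][b] for i in range(m)) for b in range(p)]
         for a in range(p)]
    bvec = [sum(X[i][a] * y[i] for i in range(m)) for a in range(p)]
    w = solve(A, bvec)
    resid = [y[i] - sum(w[a] * X[i][a] for a in range(p)) for i in range(m)]
    return (sum(r * r for r in resid) / m) ** 0.5

random.seed(1)
# Five synthetic channels; the target depends only on channels 0 and 2.
channels = [[random.uniform(0, 1) for _ in range(80)] for _ in range(5)]
y = [2.0 * channels[0][i] - 1.0 * channels[2][i] + 0.5 for i in range(80)]

best = min(combinations(range(5), 2),
           key=lambda s: rms_for_subset(channels, y, s))
print(best)  # (0, 2)
```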
Directory of Open Access Journals (Sweden)
Arostegui I
2018-03-01
Inmaculada Arostegui,1–3 Nerea Gonzalez,2,4 Nerea Fernández-de-Larrea,5,6 Santiago Lázaro-Aramburu,7 Marisa Baré,2,8 Maximino Redondo,2,9 Cristina Sarasqueta,2,10 Susana Garcia-Gutierrez,2,4 José M Quintana2,4 On behalf of the REDISSEC CARESS-CCR Group2 1Department of Applied Mathematics, Statistics and Operations Research, University of the Basque Country UPV/EHU, Leioa, Bizkaia, Spain; 2Health Services Research on Chronic Patients Network (REDISSEC), Galdakao, Bizkaia, Spain; 3Basque Center for Applied Mathematics – BCAM, Bilbao, Bizkaia, Spain; 4Research Unit, Galdakao-Usansolo Hospital, Galdakao, Bizkaia, Spain; 5Environmental and Cancer Epidemiology Unit, National Center of Epidemiology, Instituto de Salud Carlos III, Madrid, Spain; 6Consortium for Biomedical Research in Epidemiology and Public Health (CIBERESP), Madrid, Spain; 7General Surgery Service, Galdakao-Usansolo Hospital, Galdakao, Bizkaia, Spain; 8Clinical Epidemiology and Cancer Screening Unit, Parc Taulí Sabadell-Hospital Universitari, UAB, Sabadell, Barcelona, Spain; 9Research Unit, Costa del Sol Hospital, Marbella, Malaga, Spain; 10Research Unit, Donostia Hospital, Donostia-San Sebastián, Gipuzkoa, Spain Introduction: Colorectal cancer is one of the most frequently diagnosed malignancies and a common cause of cancer-related mortality. The aim of this study was to develop and validate a clinical predictive model for 1-year mortality among patients with colon cancer who survive for at least 30 days after surgery. Methods: Patients diagnosed with colon cancer who had surgery for the first time and who survived 30 days after the surgery were selected prospectively. The outcome was mortality within 1 year. Random forest, genetic algorithms and classification and regression trees were combined in order to identify the variables and partition points that optimally classify patients by risk of mortality. The resulting decision tree was categorized into four risk categories.
Statistical signal processing techniques for coherent transversal beam dynamics in synchrotrons
Energy Technology Data Exchange (ETDEWEB)
Alhumaidi, Mouhammad
2015-03-04
identifying and analyzing the betatron oscillation sourced from the kick based on its mixing and temporal patterns. The accelerator magnets can generate unwanted spurious linear and non-linear fields due to fabrication errors or aging. These error fields can excite undesired resonances which, together with the space-charge tune spread, lead to long-term beam losses and reduced dynamic aperture. Therefore, knowledge of the linear and non-linear magnet errors in circular accelerator optics is crucial for controlling and compensating resonances and their consequent beam losses and beam quality deterioration. This is indispensable, especially for high-intensity machines. Fortunately, the relationship between the beam offset oscillation signals recorded at the BPMs is a manifestation of the accelerator optics, and can therefore be exploited in the determination of the linear and non-linear optics components. Thus, beam transversal oscillations can be excited deliberately for diagnostic purposes in the operation of particle accelerators. In this thesis, we propose a novel method for detecting and estimating the non-linear components of the optics lattice located between the locations of two BPMs by analyzing the beam offset oscillation signals of a BPM triple containing these two BPMs. Depending on the non-linear components between the locations of the BPM triple, the relationship between the beam offsets follows a corresponding multivariate polynomial. After calculating the covariance matrix of the polynomial terms, the Generalized Total Least Squares method is used to find the model parameters, and thus the non-linear components. A bootstrap technique is used to detect the existing polynomial model orders by means of multiple hypothesis testing, and to determine confidence intervals for the model parameters.
Rimayi, Cornelius; Odusanya, David; Mtunzi, Fanyana; Tsoka, Shepherd
2015-01-01
This paper investigates the efficiency of four different calibration techniques, namely matrix-matched internal standard (MMIS), matrix-matched external standard (MMES), solvent-only internal standard (SOIS) and solvent-only external standard (SOES), for the detection and quantification of 20 organochlorine compounds from high-matrix, low-matrix and blank water samples by Gas Chromatography–Mass Spectrometry (GC-MS) coupled to solid-phase extraction (SPE). Further statistical testing, using the Statistical Package for the Social Sciences (SPSS) to apply MANOVA, t-tests and Levene's F tests, indicates that matrix composition has a more significant effect on the efficiency of the analytical method than the calibration method of choice. Matrix effects are widely described as one of the major sources of error in GC-MS multiresidue analysis. Descriptive and inferential statistics showed that matrix-matched internal standard calibration was the best approach for samples of varying matrix composition, as it produced the most precise average mean recovery of 87% across all matrices tested. Internal standard calibration overall produced more precise total recoveries than external standard calibration, with mean values of 77% and 64%, respectively. The internal standard calibration technique produced a particularly high overall standard deviation of 38% at the 95% confidence level, indicating that it is less robust than the external standard calibration method, which had an overall standard error of 32% at the 95% confidence level. Overall, matrix-matched external standard calibration proved to be the best approach for analysis of low-matrix samples, which consisted of the real sample matrix, as it had the most precise recovery of 98% compared to the other calibration approaches for the low-matrix samples.
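The internal-standard arithmetic underlying MMIS/SOIS can be shown in a few lines: a response factor is determined from a calibration standard of known concentrations, then unknowns are quantified from the analyte/internal-standard area ratio. All peak areas and concentrations below are invented for illustration.

```python
def internal_standard_conc(area_analyte, area_is, conc_is, rf):
    """Quantify via internal standard: C_a = (A_a / A_IS) * C_IS / RF."""
    return (area_analyte / area_is) * conc_is / rf

# Calibration standard with known concentrations gives the response factor:
# RF = (A_a / A_IS) / (C_a / C_IS)
rf = (5000.0 / 4000.0) / (10.0 / 8.0)   # = 1.0 for this invented standard

# Unknown sample spiked with the same internal standard concentration:
c = internal_standard_conc(area_analyte=3000.0, area_is=4000.0,
                           conc_is=8.0, rf=rf)
print(c)  # 6.0
```

Because analyte and internal standard experience the same matrix suppression or enhancement, the ratio cancels much of the matrix effect, which is consistent with the paper's finding that MMIS handles samples of varying matrix composition best.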
Kang, Sunghong; Seon, Seok Kyung; Yang, Yeong-Ja; Lee, Aekyung; Bae, Jong-Myon
2006-09-01
The aim of this study was to develop a methodology for estimating a nationwide statistic for hernia operations using the claim database of the Korea Health Insurance Corporation (KHIC). According to the insurance claim procedures, the claim database was divided into the electronic data interchange database (EDI_DB) and the sheet database (Paper_DB). Although the EDI_DB has operation and management codes recording the facts and kinds of operations, the Paper_DB does not. Using the hernia-matched management code in the EDI_DB, the cases of hernia surgery were extracted. To draw the potential cases from the Paper_DB, which lacks the code, a predictive model was developed using the data mining technique called SEMMA. The claim sheets of cases whose predicted probability of an operation exceeded a threshold, decided by the ROC curve, were identified in order to obtain the positive predictive value as an index of the usefulness of the predictive model. Of the claim database for 2004, 14,386 cases had hernia-related management codes submitted through the EDI system. For fitting the models with the data mining technique, logistic regression was chosen over the neural network and decision tree methods. From the Paper_DB, 1,019 cases were extracted as potential cases. Direct review of the sheets of the extracted cases showed that the positive predictive value was 95.3%. The results suggest that applying the data mining technique to the claim database of the KHIC to estimate nationwide surgical statistics would be useful from the standpoint of feasibility and cost-effectiveness.
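The modelling step (logistic regression scored by positive predictive value) can be sketched as below. The two invented claim features and labels stand in for the real KHIC variables; this is plain batch gradient descent, not the SEMMA tooling the study used.

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Logistic regression by batch gradient descent (w[0] is the bias)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(a * b for a, b in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, a in enumerate(xi):
                grad[j + 1] += err * a
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad)]
    return w

def predict_proba(w, xi):
    z = w[0] + sum(a * b for a, b in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

random.seed(3)
# Hypothetical claim features (e.g., a scaled cost figure and a procedure
# flag); label 1 = hernia operation.
X = [[random.uniform(0, 1), 1.0] for _ in range(40)] + \
    [[random.uniform(2, 3), 0.0] for _ in range(40)]
y = [1] * 40 + [0] * 40

w = train_logistic(X, y)
preds = [1 if predict_proba(w, xi) > 0.5 else 0 for xi in X]
# Positive predictive value: fraction of predicted positives that are true.
ppv = sum(1 for p, t in zip(preds, y) if p == 1 and t == 1) / max(1, sum(preds))
print(ppv > 0.9)  # True
```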
International Nuclear Information System (INIS)
Hernandez M, B.
1997-01-01
The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over the annual cycle of 1990, based on their concentrations obtained through the PIXE technique; to identify suitable statistical methods for analyzing the metal concentration data measured in total suspended particles (PST); and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to support the decision-making and control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)
Paul, Michael; Arora, Karunesh; Sumita, Eiichiro
This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
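The lexical-approximation step can be sketched with the standard library's fuzzy matcher: an out-of-vocabulary token is replaced by its closest spelling or inflectional variant found in the training vocabulary, if one clears a similarity cutoff. The vocabulary, tokens and cutoff below are invented; real SMT systems match against the phrase-table source side.

```python
import difflib

# Training-corpus vocabulary known to a hypothetical SMT phrase table:
vocab = ["played", "playing", "garden", "gardens", "translate", "translated"]

def replace_oov(tokens, vocab, cutoff=0.75):
    """Replace each out-of-vocabulary token with its closest variant
    found in the training data, keeping it unchanged if none qualifies."""
    out = []
    for tok in tokens:
        if tok in vocab:
            out.append(tok)
        else:
            match = difflib.get_close_matches(tok, vocab, n=1, cutoff=cutoff)
            out.append(match[0] if match else tok)
    return out

print(replace_oov(["playd", "in", "the", "gardn"], vocab))
```

Here the misspelled "playd" and "gardn" map to in-vocabulary forms, while "in" and "the" (absent from this toy vocabulary and unlike any entry) pass through untouched, mirroring the paper's goal of reducing, not forcing, OOV substitutions.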
Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.
2017-12-01
Fire disasters affect modern societies on a global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, wildfires have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that follow the fire and magnify its overall impact on a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over recent years a number of efforts from the United States Geological Survey (USGS) and the National Weather Service (NWS) have focused on the development of early warning systems to help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work is based on a recently released USGS database reporting a total of 1500 storms that did or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, maximum intensity, etc.) and auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model achieves a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
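The threat score (critical success index) used above as the skill measure is computed from a 2x2 contingency table of predicted versus observed debris-flow events. The counts below are invented for illustration, not taken from the USGS database.

```python
# Threat score (CSI) from forecast verification counts. Values invented.

def threat_score(hits, misses, false_alarms):
    """CSI = hits / (hits + misses + false alarms); 1.0 is a perfect forecast."""
    return hits / (hits + misses + false_alarms)

# e.g. 60 correctly predicted debris-flow events, 20 missed, 15 false alarms
print(round(threat_score(60, 20, 15), 3))
```

Note that correct negatives (storms correctly predicted not to trigger a flow) do not enter the score, which is why it suits rare-event verification.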
Flore, Jacinthe
2016-09-01
This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination. © The Author(s) 2016.
Kutkut, Ahmad; Andreana, Sebastiano; Monaco, Edward
2012-01-01
An esthetic restoration supported by dental implant rehabilitation is a major challenge for restorative dentists. The ultimate goal of a dental implant is to replace missing or extracted teeth with restorations that are anatomically correct, esthetic, and functional in the long term. Alveolar ridge preservation and site enhancement following tooth extraction have a major impact on hard and soft tissue volume. Extraction socket preservation is technique sensitive, not 100% successful, and at times unpredictable. Current techniques may delay surgical implant placement for a few months, and the quality of new bone regeneration is questionable. The aim of this report was to describe a minimally traumatic extraction socket preservation technique, using different types of bone graft as a preserver prior to implant placement, applied in 80 consecutive cases.
Barman, S.; Bhattacharjya, R. K.
2017-12-01
The River Subansiri is the major north-bank tributary of the river Brahmaputra. It originates beyond the Great Himalayan range at an altitude of approximately 5340 m. The Subansiri basin extends from tropical to temperate zones and hence exhibits great diversity in rainfall characteristics. In the Northern and Central Himalayan tracts, precipitation is scarce on account of the high altitudes. On the other hand, the southeastern part of the basin, comprising the sub-Himalayan and plain tracts in Arunachal Pradesh and Assam, lies in the tropics. Owing to both the Northeast and Southwest monsoons, precipitation in this region is abundant; the Southwest monsoon in particular causes very heavy precipitation over the entire basin from May to October. In this study, rainfall over the Subansiri basin was studied at 24 locations using multiple linear and non-linear regression based statistical downscaling techniques and an Artificial Neural Network based model. APHRODITE's gridded rainfall data at 0.25˚ x 0.25˚ resolution and climatic parameters of the HadCM3 GCM at 2.5˚ x 3.75˚ (latitude by longitude) resolution were used. It was found that the multiple non-linear regression based statistical downscaling technique outperformed the other techniques. Using this method, the future rainfall pattern over the Subansiri basin was analyzed up to the year 2099 for four time periods, viz., 2020-39, 2040-59, 2060-79, and 2080-99, at all 24 locations. On the basis of historical rainfall, the months were categorized as wet months, months with moderate rainfall, and dry months, and the spatial changes in rainfall patterns for these three categories were analyzed over the basin. A potential decrease of rainfall in the wet and moderate months and an increase of rainfall in the dry months are observed in the projected rainfall pattern of the Subansiri basin.
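In its simplest form, regression-based statistical downscaling fits observed station rainfall against a coarse GCM predictor by least squares and then applies the fitted relation to future GCM output. The single-predictor sketch below uses invented values; the study used multiple linear and non-linear predictors.

```python
# Minimal single-predictor downscaling sketch: ordinary least squares fit
# of station rainfall on one GCM variable. All data are invented toy values.

def ols_fit(x, y):
    """Return (slope, intercept) minimizing squared error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical GCM humidity predictor vs. observed monthly rainfall (mm)
gcm = [2.0, 3.0, 5.0, 6.0, 8.0]
obs = [110.0, 160.0, 260.0, 310.0, 410.0]

slope, intercept = ols_fit(gcm, obs)
future_gcm = 7.0                          # assumed future predictor value
print(slope * future_gcm + intercept)     # downscaled rainfall estimate
```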
Takeshita, Satoshi; Takagi, Ayumu; Saito, Shigeru
2012-08-01
We previously proposed a technique called the "mother-child technique" to facilitate stent delivery for complex coronary lesions. This technique is applicable when the backup support of the guiding catheter is insufficient. In this study, we used an in vitro coronary artery tree model to determine the impact of the size of the mother guiding catheter on the backup support of the mother-child guiding system. The backup support was measured for the 4-in-5, 4-in-6, 4-in-7, and 4-in-8 systems as well as for the 5-in-6, 5-in-7, and 5-in-8 systems. Advancement of the child catheter into the coronary artery tree model improved the backup support of the mother-child system. When a 4-Fr child catheter was advanced by 9 cm, the relative increase in the backup support was 174% in the 4-in-5 system, 203% in the 4-in-6 system, and 135% in the 4-in-7 system, with the relative benefit depending on the size of the mother guiding catheter. Thus, the mother-child technique may be most useful for PCIs in which a small guiding catheter is used, such as transradial coronary interventions. Copyright © 2012 Wiley Periodicals, Inc.
Cortese, Antonio; D'Alessio, Giuseppe; Brongo, Sergio; Amato, Massimo; Sarno, Maria Rosaria; Claudio, Pier Paolo
2016-10-01
Facial aesthetics have changed greatly relative to the common standards of the past. Modern concepts of beauty, from popular models of beautiful faces to actors, show a biprotrusive aspect with high tension of the soft tissues. Facial symmetry has been proposed as a marker of developmental stability that may be important in human mate choice, and any deviation from perfect symmetry can be considered a reflection of imperfect development. The goal of maxillofacial surgery should be to give the best results in both aesthetic and functional terms. Following these new concepts of facial aesthetics, new surgical procedures based on osteodistraction techniques can lead to a very natural final result by harmonizing the face. The aim of this study was to assess the aesthetic results in 10 patients operated on for skeletal discrepancies by maxillary distraction and jaw repositioning, compared with another 10 patients operated on by conventional techniques, rated on a 5-point Likert scale.
Lyons, Matthew D; Carson, Culley C; Coward, Robert M
2015-01-01
Placement of an inflatable penile prosthesis (IPP) is the mainstay of surgical treatment for patients with Peyronie’s disease (PD) and concomitant medication-refractory erectile dysfunction. Special considerations and adjunctive surgical techniques during the IPP procedure are often required for patients with PD to improve residual penile curvature, as well as postoperative penile length. The surgical outcomes and various adjunctive techniques are not significantly different from one another, and selection of the appropriate technique must be tailored to patient-specific factors including the extent of the deformity, the degree of penile shortening, and preoperative patient expectations. The aims of this review were to assess the current literature on published outcomes and surgical techniques involving IPP placement in the treatment of PD. Patient satisfaction and preferences are reported, along with the description and patient selection for surgical techniques that include manual penile modeling, management of refractory curvature with concurrent plication, and correction of severe residual curvature and penile shortening with tunica release and plaque incision and grafting. A thorough description of the available techniques and their associated outcomes may help guide surgeons to the most appropriate choice for their patients. PMID:26251633
Murphy, Robyn M; Lamb, Graham D
2013-12-01
Western blotting has been used for protein analyses in a wide range of tissue samples for more than 30 years. A number of important considerations are fundamental to successful Western blotting but are often overlooked or not appreciated. Firstly, proteins expressed at low levels may often be better detected by dramatically reducing the amount of sample loaded. Single-cell (fibre) Western blotting demonstrates that proteins can be detected in samples of 5-10 μg total mass (1-3 μg total protein), an order of magnitude less than is often used. Using heterogeneous skeletal muscle as a representative tissue, the need to undertake Western blotting in sample sizes equivalent to single fibre segments is demonstrated. Secondly, incorrect results can be obtained if samples are fractionated and a proportion of the protein of interest is inadvertently discarded during sample preparation. Thirdly, quantitative analyses demand that a calibration curve be used, regardless of whether a loading control is included; the loading control must itself be proven not to change with the intervention and must also be appropriately calibrated. Fourthly, antibody specificity must be proven using whole-tissue analyses, and for immunofluorescence analyses it is vital that only a single protein is detected. If appropriately undertaken, Western blotting is reliable, quantitative in both relative and absolute terms, and extremely valuable.
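The calibration-curve requirement above can be illustrated with a minimal sketch: fit band intensity against known loaded amounts of a standard, then invert the fitted line to quantify an unknown band. All signal values are invented and a linear detection range is assumed.

```python
# Sketch of calibration-curve quantification for Western blotting.
# Standards and signals are invented; assumes a linear detection range.

def fit_line(x, y):
    """Least-squares (slope, intercept) for the standard curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

standards = [1.0, 2.0, 4.0, 8.0]        # micrograms of standard loaded
signal    = [12.0, 22.0, 42.0, 82.0]    # band intensity (arbitrary units)

slope, intercept = fit_line(standards, signal)
unknown_signal = 52.0
amount = (unknown_signal - intercept) / slope   # invert the curve
print(amount)
```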
Alam, Nayab; Ahmad, Sajid Rashid; Qadir, Abdul; Ashraf, Muhammad Imran; Lakhan, Calvin; Lakhan, V Chris
2015-10-01
Soils from different land use areas in Lahore City, Pakistan, were analyzed for concentrations of the heavy metals cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb). One hundred and one samples were randomly collected from six land use categories: park, commercial, agricultural, residential, urban, and industrial. Each sample was analyzed in the laboratory with the tri-acid digestion method, and metal concentrations were obtained with an atomic absorption spectrophotometer. The statistical techniques of analysis of variance, correlation analysis, and cluster analysis were used to analyze all data. In addition, kriging, a geostatistical procedure supported by ArcGIS, was used to model and predict the spatial concentrations of the four heavy metals. The results demonstrated significant correlation among the heavy metals in the urban and industrial areas. The dendrogram and the cluster analysis results indicated that the agricultural, commercial, and park areas had high concentrations of Cr, Ni, and Pb. High concentrations of Cd and Ni were also observed in the residential and industrial areas, respectively. The maximum concentrations of both Cd and Pb exceeded world toxic limit values. The kriging method demonstrated increasing spatial diffusion of both Cd and Pb concentrations throughout and beyond the Lahore City area.
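The correlation-analysis step above can be sketched as a pairwise Pearson coefficient between two metals across samples. The five sample values below are invented, not the Lahore measurements.

```python
# Pearson correlation between two metal concentration series. Values invented.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

cd = [0.4, 0.9, 1.3, 2.1, 2.8]     # hypothetical Cd, mg/kg
pb = [20.0, 50.0, 55.0, 100.0, 120.0]  # hypothetical Pb, mg/kg

print(round(pearson(cd, pb), 3))
```

In the study such coefficients were computed for all metal pairs within each land use area; a strong positive value suggests a shared source.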
Energy Technology Data Exchange (ETDEWEB)
Choi, S.O.; Kwon, K.S.; Kim, I.H.; Cho, W.J.; Shin, H.S.; Lee, J.R.; Song, W.K.; Synn, J.H.; Park, C. [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)
1997-12-01
Most ground stability analyses of subsidence-prone areas have been performed through conventional routine work consisting of a geological survey, a review of the ragged mining map, trace-investigation of surface subsidence, coring in the prone areas, rock mass classification, and two-dimensional numerical analysis. Through this work, the stability problems of a surface structure and the tendency toward surface subsidence can be analyzed. However, many problems have been pointed out in such subsidence analyses, owing to the lack of quantitative data in the geological survey and the unreliability of the input data for numerical analysis. New techniques for assessing ground stability in subsidence areas, which can replace the conventional passive method, have also been requested by civil and mining engineers for the safety control of surface structures including roads and tunnels. In this study, the basic mechanism of surface subsidence was surveyed first, and proper input data for two- and three-dimensional numerical analysis were selected. These results were applied to the Si-Heung Mine. According to the two-dimensional numerical analysis, there is no possibility of surface subsidence even though tension failure developed up to a region three times the height of the cavity. Meanwhile, the existing data on joints and groundwater were re-evaluated in order to analyze their effects on subsidence. If the characteristics of the spatial data on them can be recognized in the future, the effect of joints and groundwater on subsidence can be determined more precisely through combination with GIS. Finally, a finite difference numerical method was applied to the Si-Heung Mine in three dimensions, but it was revealed that there are some problems in the three-dimensional technique. In other words, it is difficult to obtain the exact spatial coordinates of the cavity, and the researcher should have
Statistical modeling for degradation data
Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru
2017-01-01
This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.
Directory of Open Access Journals (Sweden)
Stephanie Cohen
2017-09-01
As the oceans become less alkaline due to rising CO2 levels, deleterious consequences are expected for calcifying corals. Predicting how coral calcification will be affected by ongoing ocean acidification (OA) requires an accurate assessment of CaCO3 deposition and an understanding of the relative importance that decreasing calcification and/or increasing dissolution play in the overall calcification budget of individual corals. Here, we assessed the compatibility of the 45Ca-uptake and total alkalinity (TA) anomaly techniques as measures of gross and net calcification (GC and NC, respectively) to determine coral calcification at pHT 8.1 and 7.5. Considering the differing buffering capacity of seawater at the two pH values, we were also interested in how strongly coral calcification alters the seawater carbonate chemistry under prolonged incubation in sealed chambers, potentially interfering with physiological functioning. Our data indicate that NC estimates by TA are erroneously ∼5% and ∼21% higher than GC estimates from 45Ca for ambient and reduced pH, respectively. Considering also previous data, we show that the consistent discrepancy between the two techniques across studies is not constant but depends largely on the absolute value of CaCO3 deposition. Deriving rates of coral dissolution from the difference between NC and GC was not possible, and we advocate a more direct approach in the future by simultaneously measuring skeletal calcium influx and efflux. Substantial changes in carbonate system parameters for incubation times beyond two hours in our experiment demonstrate the necessity of testing and optimizing experimental incubation setups when measuring coral calcification in closed systems, especially under OA conditions.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Directory of Open Access Journals (Sweden)
Abdelbaset Buhmeida
2011-05-01
The role of DNA content as a prognostic factor in colorectal cancer (CRC) is highly controversial. Some of these controversies are due to purely technical reasons, e.g. variable practices in interpreting DNA histograms, which is problematic particularly in advanced cases. In this report, we give a detailed account of the various options for optimally interpreting these histograms, with the aim of establishing the potential value of DNA image cytometry in prognosis and in the selection of proper treatment. The material consists of nuclei isolated from 50 µm paraffin sections from 160 patients with stage II, III or IV CRC diagnosed, treated and followed up in our clinic. The nuclei were stained with the Feulgen stain, and nuclear DNA was measured using computer-assisted image cytometry. We applied four different approaches to analyse the DNA histograms: (1) appearance of the histogram (the ABCDE approach), (2) range of DNA values, (3) peak evaluation, and (4) events present at high DNA values. Intra-observer reproducibility of these four histogram interpretations was 89%, 95%, 96%, and 100%, respectively. We depict selected histograms to illustrate the four analytical approaches in cases of different stages of CRC with variable disease outcome. In our analysis, the range of DNA values was the best prognosticator, i.e., the tumours with the widest histograms had the most ominous prognosis. These data indicate that DNA cytometry based on isolated nuclei is valuable in predicting the prognosis of CRC. The different interpretation techniques differed in their reproducibility, but the method showing the best prognostic value also had high reproducibility in our analysis.
Akifuddin, Syed; Khatoon, Farheen
2015-12-01
Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures, and to reduce the incidence of those complications by introducing six sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used, together with failure mode and effect analysis, to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures. Pareto analysis was used to identify the most frequently recurring complications, and a paired z-test in Minitab and Fisher's exact test were used to statistically analyse the obtained data. The results indicate that the six sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
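The Pareto-analysis step can be sketched as ranking complication types by frequency and picking the "vital few" categories that account for roughly 80% of cases. The complication types and counts below are invented for illustration, not the study's data.

```python
# Pareto analysis sketch: find the categories covering ~80% of occurrences.
# Complication counts are invented.

def pareto_vital_few(counts, cutoff=0.8):
    """Return categories, most frequent first, whose cumulative share first
    reaches the cutoff."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    vital, cum = [], 0
    for name, n in ranked:
        vital.append(name)
        cum += n
        if cum / total >= cutoff:
            break
    return vital

complications = {"pain on injection": 45, "hematoma": 25, "trismus": 15,
                 "needle breakage": 3, "paresthesia": 7, "syncope": 5}
print(pareto_vital_few(complications))
```

The "vital few" returned here would be the targets of the Improve phase of DMAIC.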
Directory of Open Access Journals (Sweden)
Ruchi Tiwari
2009-12-01
The present study investigated a novel extended-release system of promethazine hydrochloride (PHC) with the acrylic polymers Eudragit RL100 and Eudragit S100 in different weight ratios (1:1 and 1:5) and in combination (0.5+1.5), using freeze-drying and spray-drying techniques. Solid dispersions were characterized by Fourier-transform infrared spectroscopy (FT-IR), differential scanning calorimetry (DSC), powder X-ray diffractometry (PXRD), nuclear magnetic resonance (NMR), and scanning electron microscopy (SEM), as well as by solubility and in vitro dissolution studies in 0.1 N HCl (pH 1.2), double-distilled water and phosphate buffer (pH 7.4). Adsorption tests from drug solution onto the solid polymers were also performed. A selected solid dispersion system was developed into a capsule dosage form and evaluated in in vitro dissolution studies. The progressive disappearance of drug peaks in the thermotropic profiles of the spray-dried dispersions was related to the increasing amount of polymer, while SEM studies suggested homogeneous dispersion of the drug in the polymer. Eudragit RL100 had a greater adsorptive capacity than Eudragit S100, and thus the (0.5+1.5) combination of S100 and RL100 exhibited a higher dissolution rate, with 97.14% drug release over twelve hours. Among the different formulations, capsules prepared with the combination of acrylic polymers using spray-drying (1:0.5+1.5) displayed extended release of drug over twelve hours with 96.87% release, following zero-order kinetics (r² = 0.9986).
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-02
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
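The multiplicity arithmetic at issue in this dispute can be reproduced directly: with k independent tests at level alpha, about k x alpha false positives are expected under the null, and the chance of at least one false positive grows as 1 - (1 - alpha)^k. This is a generic illustration of the calculation, not the NTP's actual correction procedure.

```python
# Multiple-comparisons arithmetic sketch (independence assumed).

def expected_false_positives(k, alpha):
    """Expected count of nominally significant results under the global null."""
    return k * alpha

def familywise_error(k, alpha):
    """P(at least one false positive) for k independent tests."""
    return 1 - (1 - alpha) ** k

print(round(expected_false_positives(4800, 0.05)))  # the 240 figure cited above
print(round(familywise_error(20, 0.05), 4))
```

A Bonferroni-style correction would instead test each comparison at alpha / k, which is one of the adjustments Gaus' calculation ignores.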
Masoud, Alaa A.
2014-07-01
Extensive urban, agricultural and industrial expansion on the western fringe of the Nile Delta of Egypt has exerted a heavy load on water resources and led to groundwater quality deterioration. Documenting the spatial variation of groundwater quality and its controlling factors is vital to ensure sustainable water management and safe use. A comprehensive dataset of 451 shallow groundwater samples was collected in 2011 and 2012. On-site field measurements of total dissolved solids (TDS), electric conductivity (EC), pH and temperature, as well as lab-based determination of the ionic composition of the major and trace components, were performed. Groundwater types were derived and suitability for irrigation use was evaluated. The multivariate statistical techniques of factor analysis and K-means clustering were integrated with geostatistical semi-variogram modeling to evaluate the spatial hydrochemical variations and their driving factors, and for hydrochemical pattern recognition. Most hydrochemical parameters showed very wide ranges: TDS (201-24,400 mg/l), pH (6.72-8.65), Na+ (28.30-7774 mg/l), and Cl- (7-12,186 mg/l), suggesting complex hydrochemical processes with multiple sources. TDS violated the limit (1200 mg/l) of the Egyptian standards for drinking water quality in many localities. Extreme concentrations of Fe2+, Mn2+, Zn2+, Cu2+ and Ni2+ are mostly related to their natural content in the water-bearing sediments and/or to contamination from industrial leakage. Very high nitrate concentrations exceeding the permissible limit (50 mg/l) were maximized toward hydrologic discharge zones and are related to wastewater leakage. Three main water types, NaCl (29%), Na2SO4 (26%), and NaHCO3 (20%), formed 75% of the groundwater and dominated in the saline depressions, the sloping sides of the coastal ridges of the depressions, and the cultivated/newly reclaimed lands intensely covered by irrigation canals, respectively. Water suitability for irrigation use clarified that the
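The K-means step above can be sketched in one dimension on TDS alone: assign each sample to the nearest cluster centre, recompute the centres, and repeat. The sample values and starting centres below are invented; the actual study clustered many hydrochemical parameters jointly.

```python
# Toy 1-D K-means for hydrochemical pattern recognition. Values invented.

def kmeans_1d(values, centers, iters=20):
    """Plain 1-D K-means: assign to nearest centre, then recompute means."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            idx = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            groups[idx].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

tds = [300, 450, 520, 800, 5200, 7400, 12000]   # mg/l, hypothetical samples
centers, groups = kmeans_1d(tds, [500.0, 8000.0])
print(centers)
```

With these toy values the samples split into a fresh cluster and a saline cluster, analogous to separating the NaHCO3-type waters from the NaCl-type saline-depression waters.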
International Nuclear Information System (INIS)
Dios, R.A.
1984-01-01
This dissertation focuses on the field of probabilistic risk assessment and its development in nuclear engineering. To provide background, the related areas of population dynamics (demography), epidemiology and actuarial science are studied by presenting information on how risk has been viewed in these areas over the years. A second major component involves presenting an overview of the mathematical models related to risk analysis to mathematics educators, and making recommendations for presenting this theory in classes of probability and statistics for mathematics and engineering majors at the undergraduate and graduate levels.
Labrique, Alain; Blynn, Emily; Ahmed, Saifuddin; Gibson, Dustin; Pariyo, George; Hyder, Adnan A
2017-05-05
In low- and middle-income countries (LMICs), historically, household surveys have been carried out by face-to-face interviews to collect survey data related to risk factors for noncommunicable diseases. The proliferation of mobile phone ownership and the access it provides in these countries offers a new opportunity to remotely conduct surveys with increased efficiency and reduced cost. However, the near-ubiquitous ownership of phones, high population mobility, and low cost require a re-examination of statistical recommendations for mobile phone surveys (MPS), especially when surveys are automated. As with landline surveys, random digit dialing remains the most appropriate approach to develop an ideal survey-sampling frame. Once the survey is complete, poststratification weights are generally applied to reduce estimate bias and to adjust for selectivity due to mobile ownership. Since weights increase design effects and reduce sampling efficiency, we introduce the concept of automated active strata monitoring to improve representativeness of the sample distribution to that of the source population. Although some statistical challenges remain, MPS represent a promising emerging means for population-level data collection in LMICs. ©Alain Labrique, Emily Blynn, Saifuddin Ahmed, Dustin Gibson, George Pariyo, Adnan A Hyder. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 05.05.2017.
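The poststratification step described above reduces to dividing population shares by sample shares within each stratum, then using the resulting weights in any estimate. The age strata, shares, and outcome rates below are invented for illustration.

```python
# Poststratification weighting sketch for a mobile phone survey. All shares
# and rates are invented; strata would typically come from a census.

def poststrat_weights(sample_share, pop_share):
    """Weight per stratum = population share / sample share."""
    return {s: pop_share[s] / sample_share[s] for s in sample_share}

def weighted_mean(stratum_means, sample_share, weights):
    num = sum(stratum_means[s] * sample_share[s] * weights[s] for s in weights)
    den = sum(sample_share[s] * weights[s] for s in weights)
    return num / den

sample_share = {"18-34": 0.6, "35-54": 0.3, "55+": 0.1}   # phone sample skews young
pop_share    = {"18-34": 0.4, "35-54": 0.4, "55+": 0.2}   # hypothetical census
smoking_rate = {"18-34": 0.30, "35-54": 0.20, "55+": 0.10}

w = poststrat_weights(sample_share, pop_share)
print(round(weighted_mean(smoking_rate, sample_share, w), 3))
```

The unweighted estimate here would be 0.25; weighting pulls it toward the population-representative value, at the cost of the increased design effect the abstract mentions.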
Lind, Mads V; Savolainen, Otto I; Ross, Alastair B
2016-08-01
Data quality is critical for epidemiology, and as scientific understanding expands, the range of data available for epidemiological studies and the types of tools used for measurement have also expanded. It is essential for the epidemiologist to have a grasp of the issues involved with different measurement tools. One tool that is increasingly being used for measuring biomarkers in epidemiological cohorts is mass spectrometry (MS), because of the high specificity and sensitivity of MS-based methods and the expanding range of biomarkers that can be measured. Further, the ability of MS to quantify many biomarkers simultaneously is advantageous compared with single-biomarker methods. However, as with all methods used to measure biomarkers, there are a number of pitfalls to consider which may have an impact on results when used in epidemiology. In this review we discuss the use of MS for biomarker analyses, focusing on metabolites and their application and potential issues related to large-scale epidemiology studies, the use of MS "omics" approaches for biomarker discovery, and how MS-based results can be used to increase the biological knowledge gained from epidemiological studies. Better understanding of the possibilities and possible problems related to MS-based measurements will help the epidemiologist in discussions with analytical chemists and lead to the use of the most appropriate statistical tools for these data.
Directory of Open Access Journals (Sweden)
WOLFGANG T. WIEDERMANN
2007-03-01
Starting from the discussion between Rasch & Guiard (2004) and von Eye (2004) concerning the use of parametric and nonparametric tests for the comparison of two samples, a further approach to this question is undertaken. Student's t-test requires interval-scaled and normally distributed data along with homogeneous variances across groups. If at least one of these prerequisites is not fulfilled, common statistical textbooks for the social sciences usually refer to the nonparametric Wilcoxon-Mann-Whitney test. Earlier simulation studies revealed the t-test to be rather robust concerning distributional assumptions. The current study extends these findings with respect to the simultaneous violation of the distributional and homogeneity assumptions. A simulation study has shown that the two tests can lead to highly contradictory results, and a more general approach to the question of whether parametric or nonparametric procedures should be used is introduced. Results indicate that the U-test seems in general to be a more appropriate instrument for psychological research.
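The two procedures under comparison can be computed directly from their definitions on one toy data set: Welch's t statistic (a variance-heterogeneity-robust variant of Student's t) and the Mann-Whitney U statistic. The samples below are invented.

```python
# Welch's t statistic and the Mann-Whitney U statistic from first principles.
# Samples are invented toy data.

from statistics import mean, variance

def welch_t(x, y):
    """t = (mean difference) / sqrt(s1^2/n1 + s2^2/n2)."""
    return (mean(x) - mean(y)) / \
           (variance(x) / len(x) + variance(y) / len(y)) ** 0.5

def mann_whitney_u(x, y):
    """U for sample x: count of pairs (xi, yj) with xi > yj, ties as 1/2."""
    return sum(1.0 if xi > yj else 0.5 if xi == yj else 0.0
               for xi in x for yj in y)

x = [12, 14, 15, 18, 21]
y = [9, 10, 13, 13, 16]

print(round(welch_t(x, y), 2), mann_whitney_u(x, y))
```

A full simulation study would repeat such computations over many samples drawn under the violated assumptions and compare the rejection rates of the two tests.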
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
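A toy version of steps (i), (ii), (iii), and (v) of this sequence can be run on synthetic data; the variable, bin counts, and thresholds below are illustrative and not taken from the two-phase flow model:

```python
# Sketch of the scatterplot pattern-detection sequence on a nonmonotonic
# input-output relation: Pearson (linear), Spearman (monotonic),
# Kruskal-Wallis on x-bins (central tendency), chi-square (randomness).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 500)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 500)   # nonmonotonic relation

r, p_lin = stats.pearsonr(x, y)               # (i) linear relationship
rho, p_mono = stats.spearmanr(x, y)           # (ii) monotonic relationship

xb = np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))        # four x-bins
h, p_kw = stats.kruskal(*[y[xb == k] for k in range(4)])      # (iii) central tendency

yb = np.digitize(y, np.quantile(y, [0.25, 0.5, 0.75]))        # four y-bins
table = np.zeros((4, 4))
for i, j in zip(xb, yb):
    table[i, j] += 1
chi2, p_chi, dof, _ = stats.chi2_contingency(table)           # (v) randomness

print(f"Pearson p={p_lin:.3g}, Spearman p={p_mono:.3g}, "
      f"KW p={p_kw:.3g}, chi2 p={p_chi:.3g}")
```

Running the tests as a sequence, as the paper recommends, catches relationships that any single procedure could miss.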
Petridis, C; Ries, T; Cramer, M C; Graessner, J; Petersen, K U; Reitmeier, F; Jaehne, M; Weiss, F; Adam, G; Habermann, C R
2007-02-01
To evaluate an ultra-fast sequence for MR sialography requiring no post-processing, and to compare the acquisition technique with regard to the effect of oral stimulation and of a parallel acquisition technique in patients with salivary gland diseases. 128 patients with salivary gland disease were prospectively examined using a 1.5-T superconducting system with a 30 mT/m maximum gradient capability and a maximum slew rate of 125 mT/m/sec. A single-shot turbo-spin-echo sequence (ss-TSE) with an acquisition time of 2.8 sec was used in transverse and oblique sagittal orientation. All images were obtained with and without a parallel imaging technique. The evaluation of the ductal system of the parotid and submandibular glands was performed using a visual scale of 1-5 for each side. The images were assessed by two independent experienced radiologists. An ANOVA with post-hoc comparisons and an overall two-tailed significance level of p = 0.05 was used for the statistical evaluation. An intraclass correlation was computed to evaluate interobserver variability, and a correlation of > 0.8 was determined, thereby indicating a high correlation. Depending on the diagnosed diseases and the absence of abruption of the ducts, all parts of the excretory ducts could be visualized in all patients using the developed technique, with an overall rating for all ducts of 2.70 (SD +/- 0.89). A high correlation was achieved between the two observers, with an intraclass correlation of 0.73. Oral application of a sialogogum significantly improved the visibility of the excretory ducts, whereas the parallel imaging technique led to a significant decrease in image quality (p = 0.011). The applied ss-TSE for MR sialography allows fast and sufficient visualization of the excretory ducts of the main salivary glands in patients, and no elaborate post-processing is required. Use of an oral sialogogum is suggested to improve the results of MR sialography.
Jaya Christiyan, K. G.; Chandrasekhar, U.; Mathivanan, N. Rajesh; Venkateswarlu, K.
2018-02-01
3D printing was successfully used to fabricate samples of polylactic acid (PLA). Processing parameters such as lay-up speed, lay-up thickness, and printing nozzle diameter were varied. All samples were tested for flexural strength using a three-point load test. A statistical mathematical model was developed to correlate the processing parameters with flexural strength. The results clearly demonstrated that lay-up thickness and nozzle diameter influenced flexural strength significantly, whereas lay-up speed hardly influenced it.
Directory of Open Access Journals (Sweden)
A. Naseerutheen
2014-03-01
In the analysis of archaeological pottery, energy-dispersive X-ray fluorescence (EDXRF) analysis has been utilized to establish the concentrations of up to fourteen chemical elements in each of 14 archaeological pottery samples from Vellore district, Tamil Nadu, India. The EDXRF results were processed using two multivariate statistical methods, cluster analysis and principal component analysis (PCA), in order to determine the similarities and correlations between the selected samples based on their elemental composition. The methodology successfully separates the samples, and two distinct chemical groups were discerned.
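The PCA-plus-clustering workflow can be sketched on synthetic data; the sample counts match the abstract (14 samples, 14 elements), but the concentration values and group structure below are invented for illustration:

```python
# Sketch: PCA and Ward's hierarchical clustering to separate pottery samples
# into chemical groups by elemental concentration (synthetic data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# 14 samples x 14 elements; two underlying chemical groups by construction
X = np.vstack([rng.normal(10.0, 0.5, size=(7, 14)),
               rng.normal(14.0, 0.5, size=(7, 14))])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each element

# PCA scores from an eigendecomposition of the correlation matrix
vals, vecs = np.linalg.eigh(np.cov(Xs.T))
order = np.argsort(vals)[::-1]
scores = Xs @ vecs[:, order[:2]]                 # first two principal components

# Ward's hierarchical clustering cut into two chemical groups
labels = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
print(labels)
```

With well-separated compositions, the cluster labels recover the two chemical groups, and a scatterplot of the two PCA scores would show the same separation visually.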
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that is suitable for them and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).
Statistical considerations in practical contamination monitoring
International Nuclear Information System (INIS)
Brodsky, A.; Gallaghar, R.G.
1991-01-01
This paper reports on an examination of smear survey practices which indicates that many radiation safety programs are lacking in important aspects of contamination monitoring. In order to satisfy regulatory and potential litigatory requirements, smear surveys should include the measurement and recording of the following data: the area of each smear and the smear procedure (by reference); total counts on the smear and the counting procedure; total counts on appropriate blanks and a description of the blank; total counts on a standard and the specifications of the standard; and all counting times. The rationale for these smear survey requirements is obtained by examining the formulation of the minimum detectable amount (MDA) on a smear.
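The data items listed above are exactly the inputs a Currie-style MDA calculation consumes, which is a common way such a formulation is written; the counting efficiency and smear removal fraction below are illustrative defaults, not values from the paper:

```python
# Hedged sketch of a Currie-style minimum detectable amount (MDA) for a
# smear count, showing why blank counts and counting times must be recorded.
import math

def mda_dpm(blank_counts, blank_time_s, sample_time_s,
            efficiency=0.3, removal_fraction=0.1):
    """Currie-style MDA (dpm removed by the smear) for a single smear count."""
    rb = blank_counts / blank_time_s                    # blank count rate, cps
    sigma0 = math.sqrt(rb / sample_time_s + rb / blank_time_s)
    ld = 2.71 / sample_time_s + 4.65 * sigma0           # detection limit, cps
    return ld * 60.0 / (efficiency * removal_fraction)  # cps -> dpm on surface

print(round(mda_dpm(blank_counts=120, blank_time_s=600, sample_time_s=60), 1))
```

Note that every argument of `mda_dpm` corresponds to one of the quantities the paper says must be measured and recorded; omit any of them and the MDA cannot be reconstructed later.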
Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao
2018-04-01
In this paper, a statistical forecast model using a time-scale decomposition method is established for the seasonal prediction of rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, the interannual component with a period of less than 8 years, the interdecadal component with a period from 8 to 30 years, and the long-term component with a period of more than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variations of FPR. The hindcast of FPR for the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
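The decompose-then-regress idea can be sketched with running means as a stand-in filter; the rainfall series, the predictors, and the filter widths below are synthetic and illustrative, not the paper's actual data or filtering method:

```python
# Sketch: split a rainfall series into <8-yr, 8-30-yr and >30-yr components
# with running means, then fit a separate linear regression per component.
import numpy as np

def running_mean(x, w):
    """Centered running mean with edge padding; w should be odd."""
    pad = np.pad(x, w // 2, mode="edge")
    return np.convolve(pad, np.ones(w) / w, mode="valid")[: len(x)]

rng = np.random.default_rng(3)
years = np.arange(1951, 2015)
rain = (500 + 30 * np.sin(2 * np.pi * years / 4)       # interannual signal
        + 20 * np.sin(2 * np.pi * years / 15)          # interdecadal signal
        + 0.5 * (years - 1951))                        # long-term trend

slow = running_mean(rain, 31)                  # > 30-yr component
mid = running_mean(rain, 9) - slow             # 8-30-yr component
fast = rain - running_mean(rain, 9)            # < 8-yr component

# one least-squares fit per component against its own (synthetic) predictor
for name, comp in [("interannual", fast), ("interdecadal", mid), ("trend", slow)]:
    pred = comp + rng.normal(0, 1, len(comp))  # stand-in selected predictor
    A = np.vstack([pred, np.ones_like(pred)]).T
    coef, *_ = np.linalg.lstsq(A, comp, rcond=None)
    print(name, np.round(coef, 2))
```

By construction the three components sum exactly back to the original series, so the three regressions' predictions can simply be added to form the final FPR forecast.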
El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz
2017-10-01
Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality.
Mazzarella, J.; Jarrett, T.; Odewahn, S.; Cutri, R.; Chester, T.; Schmitz, M.; Monkewitz, S.; Madore, B.
1999-05-01
The Spring 1999 Incremental Release of the Two Micron All-Sky Survey (2MASS) Extended Source Catalog (XSC) contains new near-infrared measurements for about eighty thousand extended objects, most of which are previously uncatalogued galaxies. Likewise, the Second Generation Digital Palomar Observatory Sky Survey (DPOSS) provides a rich archive of new visual measurements over the same regions of the sky. Concise graphical and statistical summary data are used to systematically quantify the source densities in various slices of the 2MASS+DPOSS parameter space, including BRIJHK color space, concentration indices, central and average surface brightnesses, and isophotal parameters. Results are also presented for a global principal components analysis of this merged 2MASS+DPOSS dataset for the Spring 1999 XSC sample, with the primary goal of identifying the most important linear combinations of variables to feed into a decision-tree algorithm which will be applied in a follow-up study to attempt supervised classification of previously uncatalogued galaxies. An initial cross-comparison with the current NASA/IPAC Extragalactic Database (NED) shows that approximately 10% of the Spring 1999 XSC sample are previously catalogued objects. Distributions of 2MASS/DPOSS sources with published morphological types and nuclear activity levels (starburst, LINER, Seyfert) available in NED are summarized in the context of forming a training set for a machine learning classifier.
Beg, Sarwar; Saini, Sumant; Bandopadhyay, Shantanu; Katare, O P; Singh, Bhupinder
2018-03-01
This research work entails quality by design (QbD)-based systematic development of nanostructured lipid carriers (NLCs) of Olmesartan medoxomil (OLM) with improved biopharmaceutical attributes. The quality target product profile (QTPP) was defined and critical quality attributes (CQAs) were earmarked. The solubility of the drug in various lipids was determined in order to screen them. NLCs were prepared by a hot-microemulsion method using the solid lipids, liquid lipids and surfactants with maximal solubility. Failure mode and effect analysis (FMEA) was carried out to identify high-risk formulation and process parameters. Further, principal component analysis (PCA) was applied to the high-risk parameters to evaluate the effect of the type and concentration of lipids and surfactants on CQAs. Systematic optimization of the critical material attributes (CMAs) was then carried out using a face-centered cubic design, and the optimized formulation was identified in the design space. FMEA and PCA suggested the suitability of stearic acid, oleic acid and Tween 80 as the CMAs for NLCs. Response surface optimization helped in identifying the optimized NLC formulation with particle size ∼250 nm, zeta potential 75%, in vitro drug release >80% within 6 h. Release kinetic modeling indicated drug release through a Fickian-diffusion mechanism. Overall, these studies indicated the successful development of NLCs using multivariate statistical approaches for improved product and process understanding.
International Nuclear Information System (INIS)
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-01-01
The aims of our study were to evaluate the effect of the application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on the image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group, 550 mA (450–600) vs. 650 mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84–6.02) vs. 5.84 mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting image quality.
Üstün-Aytekin, Özlem; Arısoy, Sevda; Aytekin, Ali Özhan; Yıldız, Ece
2016-03-01
X-prolyl dipeptidyl aminopeptidase (PepX) is an intracellular enzyme from the Gram-positive bacterium Lactococcus lactis spp. lactis NRRL B-1821, and it has commercial importance. The objective of this study was to compare the effects of several cell disruption methods on the activity of PepX. Statistical optimization methods were applied to two cavitation methods, hydrodynamic (high-pressure homogenization) and acoustic (sonication), to determine the more appropriate disruption method. A two-level factorial design (2FI), with the parameters number of cycles and pressure, and a Box-Behnken design (BBD), with the parameters cycle, sonication time, and power, were used for the optimization of the high-pressure homogenization and sonication methods, respectively. In addition, disruption methods consisting of lysozyme, bead milling, heat treatment, freeze-thawing, liquid nitrogen, ethylenediaminetetraacetic acid (EDTA), Triton-X, sodium dodecyl sulfate (SDS), chloroform, and antibiotics were performed and compared with the high-pressure homogenization and sonication methods. The optimized values for high-pressure homogenization were one cycle at 130 MPa, providing an activity of 114.47 mU ml(-1), while sonication afforded an activity of 145.09 mU ml(-1) at 28 min with 91% power and three cycles. In conclusion, sonication was the more effective disruption method, and its optimal operation parameters were determined for the release of an intracellular enzyme from a L. lactis spp. lactis strain, which is a Gram-positive bacterium.
Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M
2014-01-01
Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction. In the literature, information is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. An analysis was carried out on several patient factors that may complicate the treatment or end result, using a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of the treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre of lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor of the occurrence of complications. Concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and the risk of complications. This study shows that only relative LLD is a predictor of the risk of complications. The additional value of this analysis is the production of a simple graph. Construction of this graph using data from a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
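The risk-vs-LLD graph described above is the fitted curve of a logistic regression with proportionate LLD as the single predictor. The sketch below uses synthetic data (the LLD range matches the abstract, but the outcomes and the "true" risk curve are invented):

```python
# Sketch: logistic regression of complication occurrence on proportionate
# limb length discrepancy (LLD), yielding a risk-vs-LLD curve.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
lld = rng.uniform(4, 42, 200)                       # proportionate LLD, %
p_true = 1.0 / (1.0 + np.exp(-(0.15 * lld - 2.5)))  # assumed "true" risk curve
complication = rng.binomial(1, p_true)              # 0/1 complication outcome

model = LogisticRegression().fit(lld.reshape(-1, 1), complication)
grid = np.linspace(4, 42, 5).reshape(-1, 1)
risk = model.predict_proba(grid)[:, 1]              # points on the risk graph
print(np.round(risk, 2))
```

Plotting `risk` against `grid` gives exactly the kind of simple LLD-versus-complication-risk graph the authors propose for comparing patient groups.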
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Boisrobert, Loic; Laclaustra, Martin; Bossa, Matias; Frangi, Andres G.; Frangi, Alejandro F.
2005-04-01
Clinical studies report that impaired endothelial function is associated with Cardio-Vascular Diseases (CVD) and their risk factors. One commonly used means of assessing endothelial function is Flow-Mediated Dilation (FMD). Classically, FMD is quantified using local indexes, e.g. maximum peak dilation. Although such parameters have been successfully linked to CVD risk factors and other clinical variables, this description does not consider all the information contained in the complete vasodilation curve. Moreover, the relation between the flow impulse and the vessel's vasodilation response to this stimulus, although not clearly known, seems to be important and is not taken into account in the majority of studies. In this paper we propose a novel global parameterization for the vasodilation and flow curves of an FMD test. This parameterization uses Principal Component Analysis (PCA) to describe independently and jointly the variability of flow and FMD curves. These curves are obtained using computerized techniques (based on edge detection and image registration, respectively) to analyze the ultrasound image sequences. The global description obtained through PCA yields a detailed characterization of the morphology of such curves, allowing the extraction of intuitive quantitative information about the vasodilation process and its interplay with flow changes. This parameterization is consistent with traditional measurements and, in a database of 177 subjects, seems to correlate more strongly (and with more clinical parameters) than classical measures with CVD risk factors and clinical parameters such as LDL- and HDL-cholesterol.
Abul Ehsan Bhuiyan, Md; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Quintana-Seguí, Pere; Barella-Ortiz, Anaïs
2018-02-01
This study investigates the use of a nonparametric, tree-based model, quantile regression forests (QRF), for combining multiple global precipitation datasets and characterizing the uncertainty of the combined product. We used the Iberian Peninsula as the study area, with a study period spanning 11 years (2000-2010). Inputs to the QRF model included three satellite precipitation products, CMORPH, PERSIANN, and 3B42 (V7); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. We calibrated the QRF model for two seasons and two terrain elevation categories and used it to generate ensembles for these conditions. Evaluation of the combined product was based on a high-resolution, ground-reference precipitation dataset (SAFRAN) available at 5 km, 1 h resolution. Furthermore, to evaluate relative improvements and the overall impact of the combined product on hydrological response, we used the generated ensembles to force a distributed hydrological model (the SURFEX land surface model and the RAPID river routing scheme) and compared its streamflow simulation results with the corresponding simulations from the individual global precipitation and reference datasets. We concluded that the proposed technique can generate realizations that successfully encapsulate the reference precipitation and provide significant improvement in streamflow simulations, with reductions in systematic and random error on the order of 20-99% and 44-88%, respectively, when considering the ensemble mean.
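A rough stand-in for QRF can be built from an ordinary random forest by taking empirical quantiles across per-tree predictions; a faithful QRF instead weights training observations by shared leaf membership, so treat this as an approximation. The data below are synthetic, with three columns standing in for the three satellite products:

```python
# Approximate quantile-regression-forest sketch: per-tree predictions from a
# random forest give an ensemble spread for the merged precipitation estimate.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
X = rng.uniform(0, 10, size=(1000, 3))           # stand-ins for 3 satellite products
y = X.mean(axis=1) + rng.normal(0, 0.5, 1000)    # stand-in reference precipitation

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
per_tree = np.stack([tree.predict(X[:5]) for tree in rf.estimators_])
q10, q50, q90 = np.quantile(per_tree, [0.10, 0.50, 0.90], axis=0)
print(np.round(q50, 2))                          # ensemble median for 5 samples
```

The (q10, q90) interval plays the role of the calibrated uncertainty band around the combined product, and the median (or mean) across trees is the deterministic merged estimate.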
Sharaf El Din, Essam; Zhang, Yun
2017-10-01
Traditional surface water quality assessment is costly, labor intensive, and time consuming; however, remote sensing has the potential to assess surface water quality because of its spatiotemporal consistency. Therefore, estimating concentrations of surface water quality parameters (SWQPs) from satellite imagery is essential. Remote sensing estimation of nonoptical SWQPs, such as chemical oxygen demand (COD), biochemical oxygen demand (BOD), and dissolved oxygen (DO), has not yet been performed because they are less likely to affect the signals measured by satellite sensors. However, concentrations of nonoptical variables may be correlated with optical variables, such as turbidity and total suspended sediments, which do affect the reflected radiation. In this context, an indirect relationship between satellite multispectral data and COD, BOD, and DO can be assumed. Therefore, this research attempts to develop an integrated approach combining Landsat 8 band ratios and stepwise regression to estimate concentrations of both optical and nonoptical SWQPs. Compared with previous studies, a significant correlation between Landsat 8 surface reflectance and concentrations of SWQPs was achieved, with a coefficient of determination (R²) > 0.85. These findings demonstrate the possibility of using this technique to develop models that estimate concentrations of SWQPs and to generate spatiotemporal maps of SWQPs from Landsat 8 imagery.
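The band-ratio-plus-stepwise-regression idea can be sketched with a simple forward selection; the reflectance values, the target variable, and the entry threshold below are synthetic and illustrative, not the study's actual bands or criteria:

```python
# Sketch: forward stepwise selection over hypothetical band ratios as
# predictors of a water-quality parameter (synthetic data).
import numpy as np

rng = np.random.default_rng(6)
n = 150
bands = rng.uniform(0.01, 0.3, size=(n, 4))            # stand-ins for 4 bands
ratios = np.column_stack([bands[:, i] / bands[:, j]
                          for i in range(4) for j in range(4) if i != j])
y = 3.0 * ratios[:, 0] + rng.normal(0, 0.1, n)         # e.g. a turbidity proxy

def fit_r2(cols):
    """R^2 of an ordinary least-squares fit on the given ratio columns."""
    A = np.column_stack([ratios[:, cols], np.ones(n)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1 - (y - A @ coef).var() / y.var()

selected, remaining = [], list(range(ratios.shape[1]))
while remaining:
    best_r2, best_j = max((fit_r2(selected + [j]), j) for j in remaining)
    if selected and best_r2 - fit_r2(selected) < 0.01:  # illustrative entry rule
        break
    selected.append(best_j)
    remaining.remove(best_j)

print(selected, round(fit_r2(selected), 3))
```

Forward selection adds the band ratio that most improves R² at each step and stops when the gain falls below the entry threshold, which mirrors how a stepwise SWQP model is typically assembled.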
International Nuclear Information System (INIS)
Xu, Yan; He, Wen; Chen, Hui; Hu, Zhihai; Li, Juan; Zhang, Tingting
2013-01-01
Aim: To evaluate the relationship between different noise indices (NIs) and radiation dose and to compare the effect of different reconstruction algorithm applications for ultra-low-dose chest computed tomography (CT) on image quality improvement and the accuracy of volumetric measurement of ground-glass opacity (GGO) nodules using a phantom study. Materials and methods: An 11 cm thick transverse phantom section with a chest wall, mediastinum, and 14 artificial GGO nodules with known volumes (919.93 ± 64.05 mm³) was constructed. The phantom was scanned on a Discovery CT 750HD scanner with five different NIs (NI = 20, 30, 40, 50, and 60). All data were reconstructed with a 0.625 mm section thickness using the filtered back-projection (FBP), 50% adaptive statistical iterative reconstruction (ASiR), and Veo model-based iterative reconstruction algorithms. Image noise was measured in six regions of interest (ROIs). Nodule volumes were measured using a commercial volumetric software package. The image quality and the volume measurement errors were analysed. Results: Image noise increased dramatically from 30.7 HU at NI 20 to 122.4 HU at NI 60 with FBP reconstruction. Conversely, Veo reconstruction effectively controlled the noise increase, with an increase from 9.97 HU at NI 20 to only 15.1 HU at NI 60. Image noise at NI 60 with Veo was even lower (50.8%) than that at NI 20 with FBP. The contrast-to-noise ratio (CNR) of Veo at NI 40 was similar to that of FBP at NI 20. All artificial GGO nodules were successfully identified and measured, with an average relative volume measurement error with Veo at NI 60 of 4.24%, comparable to a value of 10.41% with FBP at NI 20. At NI 60, the radiation dose was only one-tenth of that at NI 20. Conclusion: The Veo reconstruction algorithm very effectively reduced image noise compared with the conventional FBP reconstructions. Using ultra-low-dose CT scanning and Veo reconstruction, GGOs can be detected and quantified with an acceptable
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
International Nuclear Information System (INIS)
Huang, Erich P; Fridlyand, Jane; Lewin-Koh, Nicholas; Yue, Peng; Shi, Xiaoyan; Dornan, David; Burington, Bart
2010-01-01
Developing the right drugs for the right patients has become a mantra of drug development. In practice, it is very difficult to identify subsets of patients who will respond to a drug under evaluation. Most of the time, no single diagnostic will be available, and more complex decision rules will be required to define a sensitive population, using, for instance, mRNA expression, protein expression or DNA copy number. Moreover, diagnostic development will often begin with in-vitro cell-line data and a high-dimensional exploratory platform, only later to be transferred to a diagnostic assay for use with patient samples. In this manuscript, we present a novel approach to developing robust genomic predictors that are not only capable of generalizing from in-vitro to patient, but are also amenable to clinically validated assays such as qRT-PCR. Using our approach, we constructed a predictor of sensitivity to dacetuzumab, an investigational drug for CD40-expressing malignancies such as lymphoma using genomic measurements of cell lines treated with dacetuzumab. Additionally, we evaluated several state-of-the-art prediction methods by independently pairing the feature selection and classification components of the predictor. In this way, we constructed several predictors that we validated on an independent DLBCL patient dataset. Similar analyses were performed on genomic measurements of breast cancer cell lines and patients to construct a predictor of estrogen receptor (ER) status. The best dacetuzumab sensitivity predictors involved ten or fewer genes and accurately classified lymphoma patients by their survival and known prognostic subtypes. The best ER status classifiers involved one or two genes and led to accurate ER status predictions more than 85% of the time. The novel method we proposed performed as well or better than other methods evaluated. We demonstrated the feasibility of combining feature selection techniques with classification methods to develop assays
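The "independently pairing feature selection and classification" strategy the authors describe maps naturally onto a pipeline evaluated by cross-validation. The sketch below is generic and uses synthetic expression-like data; the feature-selection and classifier choices are illustrative, not the ones the paper evaluated:

```python
# Sketch: pair a univariate feature-selection step with a classifier and
# evaluate the combination by cross-validation (synthetic "genomic" data).
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.normal(size=(120, 500))                     # 500 "genes", 120 samples
y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, 120) > 0).astype(int)

# selection happens inside each CV fold, avoiding selection bias
pipe = make_pipeline(SelectKBest(f_classif, k=10), LogisticRegression())
acc = cross_val_score(pipe, X, y, cv=5).mean()
print(round(acc, 2))
```

Swapping either pipeline stage (a different score function, a different classifier) reproduces the paper's idea of mixing and matching the two components, and keeping the selection inside the cross-validation loop matters: selecting features on the full dataset first would inflate the estimated accuracy.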
Herojeet, Rajkumar; Rishi, Madhuri S.; Lata, Renu; Dolma, Konchok
2017-09-01
multivariate techniques for reliable quality characterization of surface water quality to develop effective pollution reduction strategies and maintain a fine balance between the industrialization and ecological integrity.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable to the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a systematic technique for implementing new, more efficient versions of statistical procedures. ...
Siebert, Eberhard; Bohner, Georg; Zweynert, Sarah; Maus, Volker; Mpotsaris, Anastasios; Liebig, Thomas; Kabbasch, Christoph
2018-04-12
To describe the clinical and radiological characteristics, frequency, technical aspects and outcome of endovascular treatment of acute basilar artery occlusion (ABO) in the setting of vertebrobasilar steno-occlusive disease. Retrospective analysis of the databases of two university stroke centers including all consecutive patients from January 2013 until May 2017 undergoing thrombectomy for a) acute stroke due to basilar artery occlusion and either significant basilar artery stenosis or vertebral artery stenosis/occlusion, as well as b) presumed embolic basilar artery occlusions. Demographics, stroke characteristics, time metrics, recanalization results and outcome were recorded. Interventional strategies were evaluated concerning the thrombectomy technique, additional angioplasty, type of approach with respect to lesion pattern (ipsilateral to the steno-occlusive VA lesion: "dirty road"; or contralateral: "clean road") and sequence of actions. Out of 157 patients treated for ABO, 38 (24.2%) had associated significant vertebrobasilar steno-occlusive lesions. An underlying significant basilar artery stenosis was present in 23.7%, and additional significant steno-occlusive vertebral lesions were present in 81.5%. Thrombectomy was performed with primary aspiration in 15.8% and with stent retrievers in 84.2%. Successful revascularization (TICI 2b-3) was achieved in 86.8%. In 52.6% additional stent angioplasty was performed, in 7.9% balloon angioplasty only. The "clean road" approach was used in 22.5% of cases, the "dirty road" in 77.4%. The final modified Rankin scale (mRS) score was 0-2 in 6 patients (15.8%) and 3-5 in 32 (84.2%). The in-hospital mortality was 36.8%. There were no statistically significant differences in outcome compared to presumed cases of embolism. Endovascular treatment of ABO with underlying significant vertebrobasilar steno-occlusive lesions is effective and reasonably safe. Specific procedural strategies apply depending on individual patient pathology and anatomy.
Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel
2017-09-01
In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as the three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG), have recently been applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be a very efficient and low-cost method. M-PG is a non-invasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. In this study, we analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
Energy Technology Data Exchange (ETDEWEB)
Somerville, Richard
2013-08-22
The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).
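The Bayesian posterior estimation mentioned in this record can be illustrated with a minimal conjugate-prior sketch; the cloud-occurrence counts and the Beta prior below are invented for illustration and are not ARM data or the project's actual method:

```python
# Hedged sketch: Beta-Bernoulli posterior for a cloud-occurrence fraction.
# With prior Beta(a, b), observing k cloudy scenes out of n gives the
# posterior Beta(a + k, b + n - k). All numbers are illustrative.
def posterior_cloud_fraction(k, n, a=1.0, b=1.0):
    """Posterior mean and variance of the cloud fraction."""
    a_post, b_post = a + k, b + n - k
    mean = a_post / (a_post + b_post)
    var = a_post * b_post / ((a_post + b_post) ** 2 * (a_post + b_post + 1))
    return mean, var

mean, var = posterior_cloud_fraction(k=62, n=100)
print(round(mean, 3))  # posterior mean shrinks the raw 62/100 toward the prior
```

With more observations the posterior concentrates, which is the property that makes such estimates useful for diagnosing simulated against observed cloud statistics.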
Velasco-Tapia, Fernando
2014-01-01
Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features for the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
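The two end-member mass-balance idea can be sketched in a few lines; the element concentrations below are invented placeholders, not Sierra de las Cruces data:

```python
# Hypothetical two end-member mass-balance sketch. The least-squares
# proportion x of the andesitic end member in a binary mixture
# mixed ≈ x*andesite + (1-x)*dacite has a closed-form solution.
def mixing_fraction(mixed, andesite, dacite):
    num = sum((m - d) * (a - d) for m, a, d in zip(mixed, andesite, dacite))
    den = sum((a - d) ** 2 for a, d in zip(andesite, dacite))
    return num / den

# Illustrative "concentrations" (e.g. SiO2, MgO, Sr) for the two magmas
dacite   = [65.0, 1.5, 450.0]
andesite = [58.0, 4.0, 600.0]
mixed    = [a_i * 0.3 + d_i * 0.7 for a_i, d_i in zip(andesite, dacite)]
print(round(mixing_fraction(mixed, andesite, dacite), 3))  # 0.3
```

The closed form follows from minimizing the summed squared misfit over all elements, which is the essence of a binary mixing mass-balance scheme.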
Directory of Open Access Journals (Sweden)
Fernando Velasco-Tapia
2014-01-01
Full Text Available Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features for the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe
2017-12-01
This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances
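The tracer-ratio reasoning combined with a Gaussian plume forward model can be sketched as follows; the dispersion coefficients, wind speed and release rates are invented placeholders, not values from the field experiments:

```python
import math

# Sketch of the tracer-ratio idea with a Gaussian plume forward model.
def plume_concentration(q, u, sigma_y, sigma_z, y, z, h):
    """Gaussian plume concentration for emission rate q, wind speed u,
    dispersion parameters sigma_y/sigma_z, crosswind offset y, receptor
    height z and release height h (ground reflection included)."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# A tracer released at a known rate constrains the transport; for
# colocated sources the unknown methane rate follows from the measured
# concentration ratio across the plume.
q_tracer = 0.5                      # known acetylene release (illustrative, g/s)
c_tracer = plume_concentration(q_tracer, 3.0, 20.0, 10.0, 5.0, 2.0, 1.0)
c_methane = 4.0 * c_tracer          # pretend plume-transect measurement
q_methane = q_tracer * c_methane / c_tracer
print(q_methane)  # 2.0
```

The study's contribution is precisely to relax the colocation assumption behind this ratio by inverting the plume model itself, with the tracer used to tune the transport and uncertainty statistics.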
Directory of Open Access Journals (Sweden)
S. Ars
2017-12-01
Full Text Available This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
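The baseline MOS idea, reduced here to a single predictor, can be sketched with ordinary least squares; the forecast/observation pairs are synthetic, not the California wind data:

```python
# Minimal MOS-style bias correction: fit observed generation on raw NWP
# forecasts with simple linear regression. The toy numbers below embed a
# systematic bias obs = 0.8*raw - 0.2, which the fit recovers.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

raw_forecast = [4.0, 6.0, 8.0, 10.0, 12.0]
observed     = [3.0, 4.6, 6.2, 7.8, 9.4]
a, b = fit_line(raw_forecast, observed)
corrected = [a + b * f for f in raw_forecast]
print(round(a, 3), round(b, 3))  # -0.2 0.8
```

The machine-learning methods in the comparison replace this linear map with more flexible regressors, but the training-then-correction workflow is the same.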
International Nuclear Information System (INIS)
Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D.; Rodriguez G, N. L.
2009-01-01
The elemental composition of archaeological ceramic fragments obtained during the explorations in San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the TRIGA Mark III research reactor with a neutron flux of 1·10¹³ n·cm⁻²·s⁻¹. The irradiation time was 2 hours. Prior to acquisition of the gamma-ray spectrum, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal component analysis, allowed the identification of three different origins of the archaeological ceramics, designated as local, foreign and regional. (Author)
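The principal-component step of such provenance studies can be sketched for two standardized element concentrations; the values are illustrative, not the San Miguel Ixtapan measurements:

```python
import math

# Minimal first-principal-component sketch for two element concentrations
# (think Cr vs Fe). For a 2x2 covariance matrix the direction of maximum
# variance has the closed-form angle 0.5*atan2(2*sxy, sxx - syy).
def first_pc_scores(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / n
    syy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return [(x - mx) * math.cos(theta) + (y - my) * math.sin(theta)
            for x, y in zip(xs, ys)]

# Two invented compositional groups ("local" low, "foreign" high)
xs = [1.0, 1.2, 0.9, 3.0, 3.1, 2.9]
ys = [1.0, 1.1, 0.95, 3.2, 3.0, 2.8]
scores = first_pc_scores(xs, ys)
print(max(scores[:3]) < min(scores[3:]))  # True: groups separate along PC1
```

Cluster analysis then groups the sherds in this reduced space, which is how the three origins were distinguished.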
Statistical Methods in Translational Medicine
Directory of Open Access Journals (Sweden)
Shein-Chung Chow
2008-12-01
Full Text Available This study focuses on strategies and statistical considerations for the assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for the selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined endpoints based on either absolute or relative change) are reviewed.
Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik
2016-04-01
Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, plausibly exists, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fitted to the observed sample. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as what would occur when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
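The cost of wrongly assuming independence can be illustrated empirically; the correlated rainfall/surge proxies below are synthetic draws, not the Elwood Catchment series, and the sketch uses simple exceedance counts rather than the Heffernan and Tawn model:

```python
import random

# Synthetic illustration: "rainfall" and "surge" proxies driven by a
# shared meteorological factor, so their extremes tend to co-occur.
random.seed(42)
shared = [random.gauss(0, 1) for _ in range(20000)]
rain  = [s * 0.6 + random.gauss(0, 0.8) for s in shared]
surge = [s * 0.6 + random.gauss(0, 0.8) for s in shared]

# 95th-percentile thresholds for each marginal series
thr_r = sorted(rain)[int(0.95 * len(rain))]
thr_s = sorted(surge)[int(0.95 * len(surge))]

# Empirical joint exceedance vs the product assumed under independence
p_joint = sum(r > thr_r and s > thr_s for r, s in zip(rain, surge)) / len(rain)
p_indep = 0.05 * 0.05
print(p_joint > p_indep)  # True: dependence inflates the joint probability
```

This is the qualitative effect the study quantifies properly with conditional extreme value modelling: ignoring the dependence understates the compound flood risk.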
Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi
2015-03-15
Remote sensing has been widely used for water quality monitoring, but most of these monitoring studies have only focused on a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which have typically been considered optically active variables. Remote sensing presents a challenge in estimating the phosphorus concentration in water. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations, primarily using the simple individual band ratio or its natural logarithm and the statistical regression method based on the field TP data and the spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the spatial modeling scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. The correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme had a higher accuracy for the TP concentration estimation in the large lake compared with the traditional individual band ratio method and the whole-lake scale regression-modeling scheme. The TP concentration values showed a clear spatial variability and were high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest TP concentration values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggested that the proposed modeling scheme, i.e., the band combinations and the regional multivariate
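A toy version of the band-combination regression idea, with a single log band ratio as predictor, can be sketched as follows; the reflectances and the TP relation are invented, not the Lake Chaohu data:

```python
import math

# Relate ln(TP) to the logarithm of a two-band reflectance ratio with
# simple least squares. The coefficients 1.5 and 0.3 are made up to give
# an exactly recoverable synthetic relation.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

band3 = [0.10, 0.12, 0.15, 0.18, 0.22]   # illustrative reflectances
band2 = [0.08, 0.09, 0.10, 0.11, 0.12]
ln_ratio = [math.log(b3 / b2) for b3, b2 in zip(band3, band2)]
ln_tp = [1.5 * r + 0.3 for r in ln_ratio]  # synthetic "field TP" relation
a, b = fit_line(ln_ratio, ln_tp)
tp_pred = math.exp(a + b * ln_ratio[0])    # back-transform to concentration
```

The study's regional scheme fits such regressions separately per lake region with multiple band combinations, rather than one whole-lake single-ratio model.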
Directory of Open Access Journals (Sweden)
Georgescu Daniel Ștefan
2014-09-01
Full Text Available This paper presents appreciations and contributions regarding the use of psychological techniques to stimulate technical creativity, with special reference to the consonant association technique and the inversion technique. The study is performed in the field of TISR transformers and electric motors with limited movement, starting from the analogy between a transformer and an electric motor with a shorted coil. It approaches a particular aspect of the inversion technique in relation to the transformation of negative effects and results of laws, phenomena and processes into useful applications. The matter referred to is related to the question: "why disadvantages and not advantages?". At the end of the paper, some experimental models produced and studied by the authors in the Research Laboratory of Machines, Equipment and Drives at the University of Suceava are presented and discussed, and conclusions drawn from the experimental study as well as directions for future research are given.
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Ebuna, D. R.; Kluesner, J.; Cunningham, K. J.; Edwards, J. H.
2016-12-01
An effective method for determining the approximate spatial extent of karst pore systems is critical for hydrological modeling in such environments. When using geophysical techniques, karst features are especially challenging to constrain due to their inherent heterogeneity and complex seismic signatures. We present a method for mapping these systems using three-dimensional seismic reflection data by combining applications of machine learning and modern data science. Supervised neural networks (NN) have been successfully implemented in seismic reflection studies to produce multi-attributes (or meta-attributes) for delineating faults, chimneys, salt domes, and slumps. Using a seismic reflection dataset from southeast Florida, we develop an objective multi-attribute workflow for mapping karst in which potential interpreter bias is minimized by applying linear and non-linear data transformations for dimensionality reduction. This statistical approach yields a reduced set of input seismic attributes to the NN by eliminating irrelevant and overly correlated variables, while still preserving the vast majority of the observed data variance. By initiating the supervised NN from an eigenspace that maximizes the separation between classes, the convergence time and accuracy of the computations are improved since the NN only needs to recognize small perturbations to the provided decision boundaries. We contend that this 3D seismic reflection, data-driven method for defining the spatial bounds of karst pore systems provides great value as a standardized preliminary step for hydrological characterization and modeling in these complex geological environments.
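One simple form of the screening step that removes overly correlated attributes before training the NN can be sketched as a correlation filter; the attribute names and values below are invented for illustration, and this is only one of several possible dimensionality-reduction choices:

```python
# Drop seismic attributes that are near-duplicates of ones already kept
# (|Pearson r| above a threshold), keeping the first of each such pair.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def screen(attrs, threshold=0.95):
    kept = {}
    for name, vals in attrs.items():   # dicts preserve insertion order
        if all(abs(pearson(vals, kv)) < threshold for kv in kept.values()):
            kept[name] = vals
    return list(kept)

attributes = {
    "coherence":  [0.1, 0.4, 0.2, 0.9, 0.7],
    "coherence2": [0.2, 0.8, 0.4, 1.8, 1.4],   # scaled copy: redundant
    "curvature":  [0.5, 0.1, 0.9, 0.3, 0.2],
}
print(screen(attributes))  # ['coherence', 'curvature']
```

The paper additionally applies linear and non-linear transformations for dimensionality reduction; the filter above only illustrates why removing redundant inputs shrinks the attribute space the NN must learn from.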
Cosmic Statistics of Statistics
Szapudi, I.; Colombi, S.; Bernardeau, F.
1999-01-01
The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
International Nuclear Information System (INIS)
Judge, L.O.
1987-01-01
An increasing variety of imaging modalities as well as refinements of interventional techniques have led to a resurgence of radiologic interest and participation in urolithiasis management. Judicious selection of the diagnostic examination, close monitoring during the procedure, consultation with urologic colleagues, and a careful regard for radiation safety guidelines define the role of the radiologist in renal stone disease
Cargnello, G; Pezza, L; Gallo, G; Camatta, T; Coccato, S; Pascarella, G; Di Gaetano, R; Casadei, G; La Torre, A; Spera, G; Scaglione, M; Moretti, S; Garofalo, A
2006-01-01
A study was carried out in order to identify agronomic and ecological solutions for indirect grey mould control on grapevine. These specific trials started in 1990 and, after years of validation, are now applied by entrepreneurs in business practice in different pedological and climatic areas and on different cultivars and training forms. The technique of "Doppia Maturazione Ragionata" (D.M.R., "Double Reasoned Ripening") consists of completing the ripening of the grapes by wilting on the plant through the reasoned cutting of the fruiting canes and/or shoots. The application of D.M.R., besides producing valid and important technical and qualitative (organoleptic, economic and social quality) improvements in the product, is particularly effective for indirect grey mould control on grapevine. Such a technique, in fact, allows the grapes to be harvested when business demands require it and not when imposed by Botrytis cinerea; it has been possible, in some cases, to harvest in December and later without problems from B. cinerea. The trials have shown the technical, economic and social sustainability of D.M.R. application. This paper reports all the trials that have by now led to the practical application of D.M.R.
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: basic concepts and the meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect gas theory of liquids, the theory of solutions, statistical thermodynamics of interfaces, statistical thermodynamics of macromolecular systems, and quantum statistics.
Directory of Open Access Journals (Sweden)
Hamid Reza Pouretedal
2018-02-01
Full Text Available Many of the physical and functional properties of RDX and HMX explosives are related to the crystalline structure of these materials. Crystalline defects affect the quality of the explosives. Therefore, in order to enhance the quality of these materials, it is necessary to form crystals with the fewest defects. In this research, we report the optimization of the recrystallization process of RDX and HMX by statistical techniques. The solvent/anti-solvent procedure was used for recrystallization of HMX and RDX particles. The four parameters of (i) ratio of anti-solvent to solvent, (ii) ratio of solute to solvent, (iii) aging time, and (iv) cooling rate of the mixture were optimized by Taguchi analysis design. A Taguchi L16 orthogonal array was used, with sixteen rows corresponding to the number of tests in four columns at four levels. The apparent density of the recrystallized RDX and HMX particles was considered as the quality characteristic with the concept of "the larger the better". The obtained graphs showed that the studied parameters were optimized at a 1:1 ratio of anti-solvent to solvent, a 0.1 g·mL⁻¹ ratio of solute to solvent, an aging time of 2 h and a cooling rate of 1 °C·min⁻¹. Also, the correlation between the investigated parameters and the apparent density of the crystals was studied by the multiple linear regression (MLR) method in order to obtain a model for predicting apparent density. The P-values indicated that at a confidence level of 95% the null hypothesis is rejected, and a meaningful contribution of the parameters to the proposed model is observed.
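The larger-the-better criterion used in Taguchi analysis can be sketched directly from its textbook definition; the density replicates below are made up, not the measured RDX/HMX apparent densities:

```python
import math

# Larger-the-better signal-to-noise ratio from Taguchi analysis:
# S/N = -10 * log10( mean(1 / y_i^2) ). The factor level with the higher
# S/N over its replicate measurements is preferred.
def sn_larger_better(ys):
    return -10 * math.log10(sum(1 / y ** 2 for y in ys) / len(ys))

level_a = [1.60, 1.62, 1.58]   # illustrative apparent densities, level 1
level_b = [1.40, 1.45, 1.38]   # level 2

print(sn_larger_better(level_a) > sn_larger_better(level_b))  # True
```

Computing this S/N per level of each of the four factors across the L16 runs is what produces the "optimum level" graphs the abstract refers to.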
Ahmed, Nisar; Khalid, Perveiz; Shafi, Hafiz Muhammad Bilal; Connolly, Patrick
2017-10-01
The use of seismic direct hydrocarbon indicators (DHIs) is very common in exploration and reservoir development to minimise exploration risk and to optimise the location of production wells. DHIs can be enhanced using AVO methods to calculate seismic attributes that approximate relative elastic properties. In this study, we analyse the sensitivity to pore fluid changes of a range of elastic properties by combining rock physics studies and statistical techniques and determine which provide the best basis for DHIs. Gassmann fluid substitution is applied to the well log data and various elastic properties are evaluated by measuring the degree of separation that they achieve between gas sands and wet sands. The method has been applied successfully to well log data from proven reservoirs in three different siliciclastic environments of Cambrian, Jurassic, and Cretaceous ages. We have quantified the sensitivity of various elastic properties such as acoustic and extended elastic (EEI) impedances, elastic moduli (Ksat and Ksat-μ), the lambda-mu-rho method (λρ and μρ), the P-to-S-wave velocity ratio (VP/VS), and Poisson's ratio (σ) at full gas/water saturation scenarios. The results are strongly dependent on the local geological settings, and our modeling demonstrates that for the Cambrian and Cretaceous reservoirs, Ksat-μ, EEI, VP/VS, and σ are more sensitive to pore fluids (gas/water). For the Jurassic reservoir, the sensitivity of all elastic and seismic properties to pore fluid is reduced due to high overburden pressure and the resultant low porosity. Fluid indicators are evaluated using two metrics: a fluid indicator coefficient based on a Gaussian model and an overlap coefficient which makes no assumptions about a distribution model. This study will provide a potential way to identify gas sand zones in future exploration.
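For the equal-variance Gaussian case, the overlap between two attribute classes has a simple closed form that can be sketched as follows; the class means and sigma are illustrative, not the well-log statistics from the study:

```python
import math

# Overlap coefficient for two equal-variance Gaussian classes (e.g. gas
# sand vs wet sand along some elastic attribute). With means m1 != m2 and
# common sigma, the densities cross midway between the means, giving
# OVL = 2 * Phi(-|m1 - m2| / (2 * sigma)).
def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def overlap_equal_sigma(m1, m2, sigma):
    return 2 * phi(-abs(m1 - m2) / (2 * sigma))

well_separated   = overlap_equal_sigma(1.8, 2.6, 0.2)   # invented VP/VS-like classes
poorly_separated = overlap_equal_sigma(1.8, 2.0, 0.2)
print(well_separated < poorly_separated)  # True: smaller overlap = better DHI
```

The study's general overlap coefficient drops the Gaussian assumption, but the interpretation is the same: attributes with less class overlap are better fluid indicators.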
Directory of Open Access Journals (Sweden)
Pascual Izquierdo-Egea
2015-03-01
Full Text Available A statistical technique to measure social conflict through the mortuary record is presented here. It was born under the contextual valuation method used in the analysis of grave goods since 1993. This is a fundamental tool for the development of the archaeology of social phenomena, whose relevant empirical results support its theoretical significance. After conveying its conceptualization in terms of social inequality and relative wealth, the two classes of social conflict are explained: static or structural and dynamic or conjunctural. Finally, connections with the Malthusian demographic law through its two parameters—population and resources—are included. The synthesis of these theoretical frameworks is illustrated with applications to ancient civilizations, including Iberian protohistory, prehispanic Mesoamerica, and early imperial Rome.
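A standard statistic for wealth inequality in grave-good assemblages is the Gini coefficient; this sketch (with invented burial values) illustrates the kind of inequality measure involved, not the author's specific technique:

```python
# Gini coefficient of per-burial grave-good "wealth", computed as the
# relative mean absolute difference. Burial values are invented.
def gini(values):
    n = len(values)
    mean = sum(values) / n
    diff_sum = sum(abs(a - b) for a in values for b in values)
    return diff_sum / (2 * n * n * mean)

equal_society   = [5, 5, 5, 5]     # wealth evenly spread: Gini 0
unequal_society = [1, 1, 1, 17]    # wealth concentrated in one burial
print(gini(equal_society), round(gini(unequal_society), 2))  # 0.0 0.6
```

Tracking such an inequality index through time against a population/resources proxy is one plausible way to operationalize the structural vs conjunctural conflict distinction the abstract draws.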
International Nuclear Information System (INIS)
Gemuend, R.
1980-01-01
In a group of 146 subjects (50 with confirmed disseminated lupus erythematosus), comparative evaluations were made for the first time with four methods (A to D) using a repurified fluorescein isothiocyanate (FITC) serum to detect antinuclear antibodies (ANA). ANA detection was performed by immunofluorescence (IFL) on frozen sections of mouse liver; by IFL on chicken erythrocyte smears previously treated with hydrochloric acid; by IFL on the ethanol-fixed flagellate Crithidia luciliae; and by radioimmunoassay (RIA) using a test kit with reference sera. The latter two tests served to detect antibodies against native DNA, which are of particular importance in lupus nephritis. A good correlation between the two methods was demonstrated by means of various statistical methods, by follow-up observations, and by examination of the reference sera. Possible reasons for the deviations found between the two tests are described. Of all four tests, RIA and IFL on Crithidia proved to track the relevant laboratory values most closely and to reflect most clearly the activity of disseminated lupus erythematosus. The particularly good correlation with the blood sedimentation rate, proteinuria, and the complement level is very apparent. The advantages and disadvantages of the applied methods are discussed, and it is emphasized that at present the method of choice for the detection of DNA antibodies is the combined examination of the patient serum both by IFL on Crithidia and by RIA.
Energy Technology Data Exchange (ETDEWEB)
Flampouri, S; Li, Z; Hoppe, B [University of Florida Health Proton Therapy Institute, Jacksonville, FL (United States)
2015-06-15
Purpose: To develop a treatment planning method for passively-scattered involved-node proton therapy of mediastinal lymphoma robust to breathing and cardiac motions. Methods: Beam-specific planning treatment volumes (bsPTV) are calculated for each proton field to incorporate pertinent uncertainties. Geometric margins are added laterally to each beam while margins for range uncertainty due to setup errors, breathing, and calibration curve uncertainties are added along each beam. The calculation of breathing motion and deformation effects on proton range includes all 4DCT phases. The anisotropic water equivalent margins are translated to distances on average 4DCT. Treatment plans are designed so each beam adequately covers the corresponding bsPTV. For targets close to the heart, cardiac motion effects on dosemaps are estimated by using a library of anonymous ECG-gated cardiac CTs (cCT). The cCT, originally contrast-enhanced, are partially overridden to allow meaningful proton dose calculations. Targets similar to the treatment targets are drawn on one or more cCT sets matching the anatomy of the patient. Plans based on the average cCT are calculated on individual phases, then deformed to the average and accumulated. When clinically significant dose discrepancies occur between planned and accumulated doses, the patient plan is modified to reduce the cardiac motion effects. Results: We found that bsPTVs as planning targets create dose distributions similar to the conventional proton planning distributions, while they are a valuable tool for visualization of the uncertainties. For large targets with variability in motion and depth, integral dose was reduced because of the anisotropic margins. In most cases, heart motion has a clinically insignificant effect on target coverage. Conclusion: A treatment planning method was developed and used for proton therapy of mediastinal lymphoma. The technique incorporates bsPTVs compensating for all common sources of uncertainties
International Nuclear Information System (INIS)
Yamada, Takuyo; Chiba, Goro; Totsuka, Nobuo; Arioka, Koji
2003-01-01
In order to evaluate the stress corrosion cracking (SCC) susceptibility of the cast duplex stainless steel used for the main coolant pipes of pressurized water reactors (PWRs), slow strain rate tests (SSRT) and constant load tests (CLT) were performed in simulated primary water at 360 °C. Cast duplex stainless steel contains a ferrite phase ranging from 8 to 23%, and its mechanical properties are affected by long-term thermal aging. We therefore paid attention to the influence of ferrite content and thermal aging on the SCC susceptibility of unaged and aged material, and prepared three kinds of specimen with different ferrite contents (23%, 15%, and 8%). The brittle fracture of the unaged specimens after SSRT mainly consists of quasi-cleavage fracture in the austenitic phase. After aging, it changes to a mixture of quasi-cleavage fracture in both the austenitic and ferritic phases. Microcracks were observed on the surfaces of unaged specimens, and of specimens aged for 10,000 hours at 400 °C, after about 10,000 hours of the CLT under load conditions of 1.2 to 2.0 times the yield strength. The crack initiation sites of CLT specimens are similar to the SSRT fracture surfaces. The SCC susceptibility of the 23% ferrite material increases with aging time at 400 °C. The SCC susceptibility of the 15% and 23% ferrite materials is higher than that of the 8% ferrite material after aging for 30,000 h at 400 °C. (author)
Manchikanti, Laxmaiah; Falco, Frank J E; Singh, Vijay; Benyamin, Ramsin M; Racz, Gabor B; Helm, Standiford; Caraway, David L; Calodney, Aaron K; Snook, Lee T; Smith, Howard S; Gupta, Sanjeeva; Ward, Stephen P; Grider, Jay S; Hirsch, Joshua A
2013-04-01
disproportionate number of challenges compared to established medical specialties, including the inappropriate utilization of ineffective and unsafe techniques. In 2000, the American Society of Interventional Pain Physicians (ASIPP) created treatment guidelines to help practitioners; there have been 5 subsequent updates. These guidelines address the issues of systematic evaluation and ongoing care of chronic or persistent pain, and provide information about the scientific basis of recommended procedures. They are expected to increase patient compliance, dispel misconceptions among providers and patients, manage patient expectations reasonably, and form the basis of a therapeutic partnership between the patient, the provider, and payers.
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Zghibi, Adel; Merzougui, Amira; Zouhri, Lahcen; Tarhouni, Jamila
2014-01-01
the dissolution of gypsum, dolomite and halite, as well as contamination by nitrate caused mainly by extensive irrigation activity. The application of multivariate statistical techniques based on Principal Component Analysis and Hierarchical Cluster Analysis has led to the corroboration of the hypotheses developed from the previous hydrochemical study. Two factors were found that explain the major hydrochemical processes in the aquifer. These factors reveal the existence of intensive seawater intrusion and of mechanisms of nitrate contamination of the groundwater.
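As a rough illustration of how Principal Component Analysis can recover a small number of factors from multivariate hydrochemical data, here is a self-contained sketch on synthetic ion concentrations (the two driving factors, the ion choices, and all values are invented for the example, not taken from the aquifer study):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via SVD of the standardised data."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    explained = (s ** 2) / np.sum(s ** 2)   # variance ratio per component
    return scores, Vt[:n_components], explained[:n_components]

rng = np.random.default_rng(1)
# Hypothetical samples: a 'salinity' factor drives Na and Cl together,
# while a separate 'nitrate' factor drives NO3.
salinity = rng.normal(size=200)
nitrate = rng.normal(size=200)
X = np.column_stack([
    10 + 3.0 * salinity + 0.1 * rng.normal(size=200),  # Na
    15 + 4.0 * salinity + 0.1 * rng.normal(size=200),  # Cl
    2 + 1.5 * nitrate + 0.1 * rng.normal(size=200),    # NO3
])
scores, loadings, explained = pca(X)
print(explained)  # two components capture nearly all variance
```

When two underlying processes generate the chemistry, the first two components absorb almost all of the variance, which is the kind of result the study uses to corroborate its hydrochemical hypotheses.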
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
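Of the three techniques compared, discriminant analysis is simple enough to sketch directly. A minimal two-class linear discriminant on synthetic data, scored by the proportion correctly classified; the data, class separation, and function name are invented for the illustration, not drawn from the TANF study:

```python
import numpy as np

def lda_fit_predict(X, y):
    """Two-class linear discriminant analysis with a pooled covariance."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class covariance matrix
    S = (np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)) / (len(X) - 2)
    w = np.linalg.solve(S, m1 - m0)      # discriminant direction
    c = w @ (m0 + m1) / 2                # midpoint decision threshold
    return (X @ w > c).astype(int)

rng = np.random.default_rng(2)
n = 400
X = np.vstack([rng.normal(0.0, 1.0, (n // 2, 2)),   # class 0 features
               rng.normal(1.5, 1.0, (n // 2, 2))])  # class 1 features
y = np.repeat([0, 1], n // 2)
pred = lda_fit_predict(X, y)
accuracy = (pred == y).mean()   # proportion correctly classified
print(accuracy)
```

The proportion of correctly classified cases is the same evaluation criterion the abstract describes for comparing CHAID, CART, and discriminant analysis.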
Target Discrimination Using Infrared Techniques: Theoretical Considerations.
1985-02-01
Variation of the bidirectional reflectance factor with wavelength and scan angle for wheat at a growth stage of 3.5 on the modified Feekes scale (boot stage) is considered. In each case, a vegetative index (VIN) is used to typify the ...
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Energy Technology Data Exchange (ETDEWEB)
Kawano, Toshihiko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-11-10
This theoretical treatment of low-energy compound nucleus reactions begins with the Bohr hypothesis, with corrections, and various statistical theories. The author investigates the statistical properties of the scattering matrix containing a Gaussian Orthogonal Ensemble (GOE) Hamiltonian in the propagator. The following conclusions are reached: For all parameter values studied, the numerical average of MC-generated cross sections coincides with the result of the Verbaarschot, Weidenmueller, Zirnbauer triple-integral formula. Energy average and ensemble average agree reasonably well when the width Γ is one or two orders of magnitude larger than the average resonance spacing d. In the strong-absorption limit, the channel degree of freedom ν_a is 2. The direct reaction increases the inelastic cross sections while the elastic cross section is reduced.
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times "The authors have set out with considerable success, to write a text which would be of interest and value to the student who,
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
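The scale invariance of the harmonic Poisson process can be checked by simulation: with intensity c/x, the expected number of points in an interval [x1, x2] is c·ln(x2/x1), so intervals with equal ratios carry equal expected counts. A small sketch (all parameter values below are arbitrary choices for the illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
c = 50.0            # harmonic intensity constant (arbitrary)
a, b = 1.0, 1000.0  # observation window on the positive half-line

# Total count on [a, b] is Poisson with mean c * ln(b/a); point locations
# follow the density 1/(x ln(b/a)), sampled by inverting its CDF.
n = rng.poisson(c * np.log(b / a))
u = rng.uniform(size=n)
points = a * (b / a) ** u

# Decades [1, 10) and [100, 1000) have equal ratio, hence equal expected
# counts c * ln(10) ≈ 115, despite being far apart on the axis.
count_low = int(np.sum((points >= 1) & (points < 10)))
count_high = int(np.sum((points >= 100) & (points < 1000)))
print(count_low, count_high)
```

That equal-count-per-decade behaviour is the same mechanism that links harmonic statistics to Benford's law and 1/f noise in the abstract above.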
Experimental Mathematics and Computational Statistics
Energy Technology Data Exchange (ETDEWEB)
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
Energy Technology Data Exchange (ETDEWEB)
Velazquez, J. C.; Caleyo, F.; Valor, A.; Hallen, J. M.
2011-07-01
New deterministic and stochastic predictive models are proposed for external pitting corrosion in underground pipelines. The deterministic model takes into consideration the local chemical and physical properties of the soil as well as the pipeline coating to predict the time dependence of pitting depth and rate in a range of soils. This model, based on results from a field study, was used to conduct Monte Carlo simulations that established the probability distribution of pitting depth and growth rate in the studied soils and their evolution over the life of the pipeline. In the last stage of the study, an empirical Markov chain-based stochastic model was developed for predicting the evolution of pitting corrosion depth and rate distributions from the observed properties of the soil. (Author) 18 refs.
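A Markov chain model of pit-depth evolution of the general kind described can be sketched as a pure-birth chain over discretised depth states; the state count and per-step growth probability below are arbitrary illustrations, not fitted values from the field study:

```python
import numpy as np

n_states = 20   # discretised pit-depth states (state i = depth bin i)
p_grow = 0.15   # per-step probability a pit deepens one bin (assumed)

# Transition matrix of a pure-birth Markov chain: at each time step a pit
# either stays in its depth bin or advances to the next one.
P = np.zeros((n_states, n_states))
for i in range(n_states - 1):
    P[i, i] = 1 - p_grow
    P[i, i + 1] = p_grow
P[-1, -1] = 1.0   # the deepest bin is absorbing

dist = np.zeros(n_states)
dist[0] = 1.0     # all pits start in the shallowest bin
for _ in range(30):          # evolve the depth distribution over 30 steps
    dist = dist @ P

mean_state = float(np.arange(n_states) @ dist)
print(round(mean_state, 2))  # ≈ 30 * 0.15 = 4.5 bins of expected growth
```

Propagating the whole distribution, rather than a single depth, is what lets this kind of model predict the evolution of pitting depth and rate distributions over the pipeline's life.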
Plant, Emma L; Smernik, Ronald J; van Leeuwen, John; Greenwood, Paul; Macdonald, Lynne M
2014-03-01
The paper-making process can produce large amounts of wastewater (WW) with high particulate and dissolved organic loads. Generally, in developed countries, stringent international regulations for environmental protection require pulp and paper mill WW to be treated to reduce the organic load prior to discharge into the receiving environment. This can be achieved by primary and secondary treatments involving both chemical and biological processes. These processes result in complex changes in the nature of the organic material, as some components are mineralised and others are transformed. In this study, changes in the nature of organics through different stages of secondary treatment of pulp and paper mill WW were followed using three advanced characterisation techniques: solid-state (13)C nuclear magnetic resonance (NMR) spectroscopy, pyrolysis-gas chromatography mass spectrometry (py-GCMS) and high-performance size-exclusion chromatography (HPSEC). Each technique provided a different perspective on the changes that occurred. To compare the different chemical perspectives in terms of the degree of similarity/difference between samples, we employed non-metric multidimensional scaling. Results indicate that NMR and HPSEC provided strongly correlated perspectives, with 86 % of the discrimination between the organic samples common to both techniques. Conversely, py-GCMS was found to provide a unique, and thus complementary, perspective.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Farajzadeh, M.; Oji, R.; Cannon, A. J.; Ghavidel, Y.; Massah Bavani, A.
2015-04-01
Seven single-site statistical downscaling methods for daily temperature and precipitation, including four deterministic algorithms [analog model (ANM), quantile mapping with delta method extrapolation (QMD), cumulative distribution function transform (CDFt), and model-based recursive partitioning (MOB)] and three stochastic algorithms [generalized linear model (GLM), Conditional Density Estimation Network Creation and Evaluation (CaDENCE), and Statistical Downscaling Model-Decision Centric (SDSM-DC)], are evaluated at nine stations located in the mountainous region of Iran's Midwest. The methods are of widely varying complexity, with input requirements that range from single-point predictors of temperature and precipitation to multivariate synoptic-scale fields. The period 1981-2000 is used for model calibration and 2001-2010 for validation, with performance assessed in terms of 27 Climate Extremes Indices (CLIMDEX). The sensitivity of the methods to large-scale anomalies and their ability to replicate the observed data distribution in the validation period are separately tested for each index by Pearson correlation and Kolmogorov-Smirnov (KS) tests, respectively. Combined tests are used to assess overall model performances. MOB performed best, passing 14.5% (49.6%) of the combined (single) tests, respectively, followed by SDSM, CaDENCE, and GLM [14.5% (46.5%), 13.2% (47.1%), and 12.8% (43.2%), respectively], and then by QMD, CDFt, and ANM [7% (45.7%), 4.9% (45.3%), and 1.6% (37.9%), respectively]. Correlation tests were passed less frequently than KS tests. All methods downscaled temperature indices better than precipitation indices. Some indices, notably R20, R25, SDII, CWD, and TNx, were not successfully simulated by any of the methods. Model performance varied widely across the study region.
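Quantile mapping, the core of one of the deterministic algorithms evaluated (QMD, shown here without the delta-method extrapolation step), can be sketched as an empirical CDF correction: each model value is passed through the model's calibration-period CDF and then through the inverse CDF of the observations. All data below are synthetic:

```python
import numpy as np

def quantile_map(model_cal, obs_cal, model_out):
    """Empirical quantile mapping: adjust model values so that their
    calibration-period distribution matches the observed one."""
    q = np.linspace(0, 1, 101)
    mq = np.quantile(model_cal, q)   # model quantiles (calibration period)
    oq = np.quantile(obs_cal, q)     # observed quantiles (same period)
    # Map each model value through model CDF, then the obs inverse CDF.
    return np.interp(model_out, mq, oq)

rng = np.random.default_rng(3)
obs = rng.normal(10, 3, 2000)     # hypothetical observed daily temperature
model = rng.normal(12, 2, 2000)   # warm-biased, under-dispersive model output
corrected = quantile_map(model, obs, model)
print(round(float(corrected.mean()), 1), round(float(corrected.std()), 1))
```

After mapping, the corrected series reproduces the observed mean and spread, which is why quantile-based methods tend to pass distributional (KS) tests more easily than correlation tests.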
Energy Technology Data Exchange (ETDEWEB)
Lee, Seung Hyun, E-mail: circle1128@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, Myung-Joon, E-mail: mjkim@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Yoon, Choon-Sik, E-mail: yooncs58@yuhs.ac [Department of Radiology, Gangnam Severance Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Lee, Mi-Jung, E-mail: mjl1213@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of)
2012-09-15
Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit ... by accounting for the significance of the materials and the equipment that enters into the production of statistics. Key words: reversible statistics, diverse materials, constructivism, economics, science, and technology.
Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae
2012-09-01
This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver, and the recognition rate had the same scale as the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
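First-order texture features like those listed (average gray level, average contrast, relative smoothness, skewness, uniformity, entropy) are commonly computed from the grey-level histogram of the ROA. A sketch using one common set of definitions, which may differ in detail from the paper's own formulation; the synthetic 50 × 50 ROA stands in for a real CT window:

```python
import numpy as np

def first_order_texture(roa):
    """First-order texture features of a region of analysis (ROA),
    computed from the 256-bin grey-level histogram. One common
    formulation; the paper's exact definitions may differ."""
    hist, _ = np.histogram(roa, bins=256, range=(0, 256))
    p = hist / hist.sum()                     # grey-level probabilities
    levels = np.arange(256)
    mean = float(levels @ p)
    var = float(((levels - mean) ** 2) @ p)
    return {
        "average_gray_level": mean,
        "average_contrast": float(np.sqrt(var)),            # std. deviation
        "relative_smoothness": 1 - 1 / (1 + var / 255 ** 2), # normalised var
        "skewness": float((((levels - mean) ** 3) @ p) / var ** 1.5) if var else 0.0,
        "uniformity": float((p ** 2).sum()),
        "entropy": float(-(p[p > 0] * np.log2(p[p > 0])).sum()),
    }

rng = np.random.default_rng(4)
roa = rng.normal(120, 15, (50, 50)).clip(0, 255)  # hypothetical 50x50 ROA
feats = first_order_texture(roa)
print(sorted(feats))
```

Feature vectors of this kind are what the discriminant analysis then compares between fatty and normal liver regions.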
Directory of Open Access Journals (Sweden)
R. Soundararajan
2015-01-01
An Artificial Neural Network (ANN) approach was used for predicting and analyzing the mechanical properties of A413 aluminum alloy produced by the squeeze casting route. The experiments were carried out with different controlled input variables such as squeeze pressure, die preheating temperature, and melt temperature as per a Full Factorial Design (FFD). The controlled process variables produce a casting with a pore-free, ideally fine-grained dendritic structure, resulting in good mechanical properties such as hardness, ultimate tensile strength, and yield strength. As a primary objective, a feed-forward back-propagation ANN model was developed with different architectures to ensure the definiteness of the values. The developed model and its predicted data were in good agreement with the experimental data, confirming the strong performance of the optimal model. From this work it was ascertained that, for castings produced by the squeeze casting route, the ANN is an alternative method for predicting the mechanical properties: appropriate results can be estimated rather than measured, thereby reducing testing time and cost. As a secondary objective, quantitative and statistical analysis was performed in order to evaluate the effect of the process parameters on the mechanical properties of the castings.
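A feed-forward back-propagation network of the general type described can be sketched in a few lines. The synthetic inputs and linear target below merely stand in for the process variables (pressure, die temperature, melt temperature) and the measured property; none of the values are from the study:

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic stand-ins: 3 normalised process inputs -> 1 property target.
X = rng.uniform(0, 1, (200, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]).reshape(-1, 1)

# One hidden layer of 8 tanh units, linear output, full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))
lr = 0.2
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)            # forward pass: hidden activations
    pred = H @ W2 + b2                  # forward pass: network output
    err = pred - y                      # mean-squared-error gradients below
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0, keepdims=True)
    dH = (err @ W2.T) * (1 - H ** 2)    # back-propagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float((err ** 2).mean())
print(mse)  # small after training
```

Once trained on measured castings, such a model "estimates rather than measures" the property for new process settings, which is the time and cost saving the abstract refers to.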
Directory of Open Access Journals (Sweden)
Hélène Colineaux
2015-10-01
INTRODUCTION. The use of genetic predictive markers in medical practice does not necessarily carry the same medical and ethical consequences as that of genes directly involved in monogenic diseases. However, French bioethics law frames the production and use of any genetic information in the same way. It therefore seems necessary to explore the practical and ethical context of the actual use of predictive markers in order to highlight their specific stakes. In this study, we document the uses of HLA-B*27, an interesting example of the multiple features of a genetic predictive marker in general medical practice. MATERIAL & METHODS. The aims of this monocentric, qualitative study were to identify the concrete and ethical issues of using the HLA-B*27 marker, and the interests and limits of the legal framework as perceived by prescribers. To this end, a thematic and descriptive analysis of five rheumatologists' semi-structured, face-to-face interviews was performed. RESULTS. According to most of the interviewees, HLA-B*27 is an over-framed test: they considered that it is not really genetic, or at least does not have the same nature as classical genetic tests; that it is not concerned by the ethical challenges of genetic testing; and that the major ethical stake of this marker is linked not to its genetic nature but rather to the complexity of the probabilistic information. The study also shows that HLA-B*27, validated for a certain usage, may be used in different ways in practice. DISCUSSION. This marker and its clinical uses underline the challenges of translating both statistical concepts and a unifying legal framework into clinical practice. The study identifies some new aspects and stakes of genetics in medicine and shows the need for additional studies on the use of predictive genetic markers, in order to provide a better basis for decisions and legal frameworks regarding these practices.
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
International Nuclear Information System (INIS)
Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae
2012-01-01
This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 x 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest for the average gray level, relatively high for the skewness and the entropy, and relatively low for the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for automatic detection and computer-aided diagnosis using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
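The six parameters named in this abstract are standard first-order (histogram-based) texture statistics. A minimal sketch of how they can be computed for a 50 x 50 pixel ROA; the normalizations follow common textbook definitions and are assumptions, not taken from the paper itself:

```python
import numpy as np

def texture_features(roa, levels=256):
    """First-order texture statistics of a region of analysis (ROA)."""
    hist, _ = np.histogram(np.asarray(roa).ravel(), bins=levels, range=(0, levels))
    p = hist / hist.sum()                    # gray-level probabilities
    z = np.arange(levels, dtype=float)
    mean = (z * p).sum()                     # average gray level
    var = ((z - mean) ** 2 * p).sum()
    nz = p[p > 0]
    return {
        "average_gray_level": mean,
        "average_contrast": np.sqrt(var),    # standard deviation of gray levels
        "relative_smoothness": 1 - 1 / (1 + var / (levels - 1) ** 2),
        "skewness": ((z - mean) ** 3 * p).sum() / (levels - 1) ** 2,
        "uniformity": (p ** 2).sum(),
        "entropy": -(nz * np.log2(nz)).sum(),
    }

# A perfectly uniform ROA concentrates all probability in one gray level
print(texture_features(np.full((50, 50), 100))["uniformity"])  # → 1.0
```

These per-ROA values are what a discriminant analysis such as MANOVA would then take as input features.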
Afeyan, Bedros; Hüller, Stefan; Montgomery, David; Moody, John; Froula, Dustin; Hammer, James; Jones, Oggie; Amendt, Peter
2014-10-01
In mid-Z and high-Z plasmas, it is possible to control crossed beam energy transfer (CBET) and subsequently occurring single or multiple beam instabilities such as Stimulated Raman Scattering (SRS) by novel means. These new techniques are inoperative when the ion acoustic waves are in their strong damping limit, such as occurs in low-Z plasmas with comparable electron and ion temperatures. For mid-Z plasmas, such as Z = 10, and near the Mach 1 surface, the strong coupling regime (SCR) can be exploited for LPI mitigation. At higher Z values, it is thermal filamentation in conjunction with nonlocal heat transport that is useful to exploit. In both settings, the strategy is to induce laser hot spot intensity dependent, and thus spatially dependent, frequency shifts to the ion acoustic waves in the transient response of wave-wave interactions. The latter is achieved by the on-off nature of spike trains of uneven duration and delay, STUD pulses. The least taxing use of STUD pulses is to modulate the beams at the 10 ps time scale and to choose which crossing beams are overlapping in time and which are not. Work supported by a grant from the DOE NNSA-OFES joint program on HEDP
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Lyons, L.
2017-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
Affum, Andrews Obeng; Osae, Shiloh Dede; Nyarko, Benjamin Jabez Botwe; Afful, Samuel; Fianko, Joseph Richmond; Akiti, Tetteh Thomas; Adomako, Dickson; Acquaah, Samuel Osafo; Dorleku, Micheal; Antoh, Emmanuel; Barnes, Felix; Affum, Enoch Acheampong
2015-02-01
In recent times, surface water resources in the Western Region of Ghana have been found to be inadequate in supply and polluted by various anthropogenic activities. As a result of these problems, the demand for groundwater by the human populations in the peri-urban communities for domestic, municipal and irrigation purposes has increased without prior knowledge of its water quality. Water samples were collected from 14 public hand-dug wells during the rainy season in 2013 and investigated for total coliforms, Escherichia coli, mercury (Hg), arsenic (As), cadmium (Cd) and physicochemical parameters. Multivariate statistical analysis of the dataset and a linear stoichiometric plot of major ions were applied to group the water samples and to identify the main factors and sources of contamination. Hierarchical cluster analysis revealed four clusters from the hydrochemical variables (R-mode) and three clusters in the case of water samples (Q-mode) after z-score standardization. Principal component analysis after a varimax rotation of the dataset indicated that the four factors extracted explained 93.3 % of the total variance, which highlighted salinity, toxic elements and hardness pollution as the dominant factors affecting groundwater quality. Cation exchange, mineral dissolution and silicate weathering influenced groundwater quality. The ranking order of major ions was Na(+) > Ca(2+) > K(+) > Mg(2+) and Cl(-) > SO4(2-) > HCO3(-). Based on the Piper plot and the hydrogeology of the study area, sodium chloride (86 %), sodium hydrogen carbonate and sodium carbonate (14 %) water types were identified. Although E. coli were absent in the water samples, 36 % of the wells contained total coliforms (Enterobacter species), which exceeded the WHO guideline limit of zero colony-forming units (CFU)/100 mL of drinking water. With the exception of Hg, the concentration of As and Cd in 79 and 43 % of the water samples exceeded the WHO guideline limits of 10 and 3
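The z-score standardization with R-mode (variables) and Q-mode (samples) hierarchical clustering described here can be sketched with SciPy. The well and variable counts below match the study, but the data values are random stand-ins, not the authors' measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical data: 14 wells x 8 hydrochemical variables (stand-in values)
X = rng.normal(size=(14, 8))

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score standardization

# Q-mode: cluster the water samples (rows), cut into three clusters
q_clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

# R-mode: cluster the hydrochemical variables (columns), cut into four clusters
r_clusters = fcluster(linkage(Z.T, method="ward"), t=4, criterion="maxclust")

print(q_clusters)  # one cluster label per well
```

Ward linkage is one common choice for hydrochemical data; the original paper does not state which linkage method was used.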
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
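The prior-to-posterior mechanism this description refers to can be illustrated with the simplest conjugate case, a Beta prior updated by binomial data. This is an illustrative sketch of the general idea, not an example taken from the book:

```python
def beta_binomial_update(a, b, k, n):
    """Beta(a, b) prior plus k successes in n Bernoulli trials gives a
    Beta(a + k, b + n - k) posterior; also return the posterior mean."""
    a_post, b_post = a + k, b + (n - k)
    return a_post, b_post, a_post / (a_post + b_post)

# Uniform prior Beta(1, 1), then 7 successes observed in 10 trials
a_post, b_post, mean = beta_binomial_update(1, 1, 7, 10)
print(a_post, b_post)  # → 8 4
```

Techniques such as MCMC and variational methods exist precisely because most models lack a closed-form update like this one.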
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Jana, Madhusudan
2015-01-01
This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Flipping the statistics classroom in nursing education.
Schwartz, Todd A
2014-04-01
Flipped classrooms are so named because they substitute the traditional lecture that commonly encompasses the entire class period with active learning techniques, such as small-group work. The lectures are delivered instead by using an alternative mode--video recordings--that are made available for viewing online outside the class period. Due to this inverted approach, students are engaged with the course material during the class period, rather than participating only passively. This flipped approach is gaining popularity in many areas of education due to its enhancement of student learning and represents an opportunity for utilization by instructors of statistics courses in nursing education. This article presents the author's recent experiences with flipping a statistics course for nursing students in a PhD program, including practical considerations and student outcomes and reaction. This transformative experience deepened the level of student learning in a way that may not have occurred using a traditional format. Copyright 2014, SLACK Incorporated.
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the EU CORDEX project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested to four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first-moment correction; first- and second-moment correction; regression functions; quantile mapping using a distribution-derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose a non-equally-weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic
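Two of the transformation techniques named in this abstract, first-moment delta change and empirical quantile mapping, can be sketched in a few lines. The series below are synthetic stand-ins (a model with a constant bias and a constant projected shift), not CORDEX data:

```python
import numpy as np

def delta_change(obs_hist, mod_hist, mod_fut):
    """First-moment delta change: shift the observed historical series
    by the model-projected change in the mean."""
    return obs_hist + (mod_fut.mean() - mod_hist.mean())

def empirical_quantile_map(obs_hist, mod_hist, mod_fut):
    """Bias correction by empirical quantile mapping: replace each future
    modelled value with the observed value at the same historical quantile."""
    q = np.linspace(0, 1, 101)
    mod_q = np.quantile(mod_hist, q)       # modelled historical quantiles
    obs_q = np.quantile(obs_hist, q)       # observed historical quantiles
    ranks = np.interp(mod_fut, mod_q, q)   # quantile of each future value
    return np.interp(ranks, q, obs_q)      # mapped onto the observed scale

rng = np.random.default_rng(1)
obs = rng.normal(10.0, 2.0, 1000)          # stand-in observed series
mod_hist = obs + 2.0                       # model with a constant +2 bias
mod_fut = mod_hist + 3.0                   # model projects a +3 change
corrected = delta_change(obs, mod_hist, mod_fut)
```

Delta change perturbs the observations with the modelled change, while bias correction (quantile mapping) corrects the modelled future series toward the observed distribution; the paper builds its ensembles from both families.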
Indian Academy of Sciences (India)
Resonance – Journal of Science Education; Volume 4, Issue 10. Statistical Computing – Understanding Randomness and Random Numbers. Sudhakar Kunte. Series Article, October 1999, pp. 16-21.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Advances in statistical multisource-multitarget information fusion
Mahler, Ronald PS
2014-01-01
This is the sequel to the 2007 Artech House bestselling title, Statistical Multisource-Multitarget Information Fusion. That earlier book was a comprehensive resource for an in-depth understanding of finite-set statistics (FISST), a unified, systematic, and Bayesian approach to information fusion. The cardinalized probability hypothesis density (CPHD) filter, which was first systematically described in the earlier book, has since become a standard multitarget detection and tracking technique, especially in research and development. Since 2007, FISST has inspired a considerable amount of research
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
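The test-efficacy measures this review covers (sensitivity, specificity, accuracy, likelihood ratios) all come from the 2x2 table of test result versus reference standard. A minimal sketch with made-up counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Test-efficacy statistics from a 2x2 diagnostic table."""
    sensitivity = tp / (tp + fn)              # true positive rate
    specificity = tn / (tn + fp)              # true negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_positive = sensitivity / (1 - specificity)
    lr_negative = (1 - sensitivity) / specificity
    return sensitivity, specificity, accuracy, lr_positive, lr_negative

# Hypothetical study: 90 TP, 20 FP, 10 FN, 80 TN
sens, spec, acc, lrp, lrn = diagnostic_metrics(90, 20, 10, 80)
print(sens, spec, acc, round(lrp, 2))  # → 0.9 0.8 0.85 4.5
```

A positive likelihood ratio of 4.5 means a positive result raises the odds of disease 4.5-fold, which is how these quantities feed into post-test probability.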
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.
Statistical considerations for grain-size analyses of tills
Jacobs, A.M.
1971-01-01
Relative percentages of sand, silt, and clay from samples of the same till unit are not identical because of different lithologies in the source areas, sorting in transport, random variation, and experimental error. Random variation and experimental error can be isolated from the other two as follows. For each particle-size class of each till unit, a standard population is determined by using a normally distributed, representative group of data. New measurements are compared with the standard population and, if they compare satisfactorily, the experimental error is not significant and random variation is within the expected range for the population. The outcome of the comparison depends on numerical criteria derived from a graphical method rather than on a more commonly used one-way analysis of variance with two treatments. If the number of samples and the standard deviation of the standard population are substituted in a t-test equation, a family of hyperbolas is generated, each of which corresponds to a specific number of subsamples taken from each new sample. The axes of the graphs of the hyperbolas are the standard deviation of new measurements (horizontal axis) and the difference between the means of the new measurements and the standard population (vertical axis). The area between the two branches of each hyperbola corresponds to a satisfactory comparison between the new measurements and the standard population. Measurements from a new sample can be tested by plotting their standard deviation vs. difference in means on axes containing a hyperbola corresponding to the specific number of subsamples used. If the point lies between the branches of the hyperbola, the measurements are considered reliable. But if the point lies outside this region, the measurements are repeated. 
Because the critical segment of the hyperbola is approximately a straight line parallel to the horizontal axis, the test is simplified to a comparison between the means of the standard population and the means of the subsample. The minimum number of subsamples required to prove significant variation between samples caused by different lithologies in the source areas and sorting in transport can be determined directly from the graphical method. The minimum number of subsamples required is the maximum number to be run for economy of effort. © 1971 Plenum Publishing Corporation.
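The simplified comparison described above (subsample mean against the standard-population mean) amounts to a one-sample t-test. A minimal sketch with hypothetical sand-percentage subsamples; the critical value shown corresponds to alpha = 0.05, two-sided, with five subsamples:

```python
import math

def mean_differs(subsample, pop_mean, t_crit=2.776):
    """One-sample t-test of n subsample measurements against the standard
    population mean. Returns True when the difference is significant,
    i.e. when the measurements should be repeated under the scheme above."""
    n = len(subsample)
    mean = sum(subsample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in subsample) / (n - 1))
    t = (mean - pop_mean) / (sd / math.sqrt(n))
    return abs(t) > t_crit

# Five hypothetical sand-percentage subsamples vs a standard mean of 40 %
print(mean_differs([41.2, 39.8, 40.5, 40.1, 39.9], 40.0))  # → False
```

Here |t| ≈ 1.18 < 2.776, so the new measurements lie within the expected range of the standard population, the in-between-the-hyperbola-branches case of the graphical method.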
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
Statistical finite element analysis.
Khalaji, Iman; Rahemifar, Kaamran; Samani, Abbas
2008-01-01
A novel technique is introduced for tissue deformation and stress analysis. Compared to the conventional Finite Element method, this technique is orders of magnitude faster and yet still very accurate. The proposed technique uses preprocessed data obtained from FE analyses of a number of similar objects in a Statistical Shape Model framework as described below. This technique takes advantage of the fact that the body organs have limited variability, especially in terms of their geometry. As such, it is well suited for calculating tissue displacements of body organs. The proposed technique can be applied in many biomedical applications such as image guided surgery, or virtual reality environment development where tissue behavior is simulated for training purposes.
Statistically tuned Gaussian background subtraction technique for ...
Indian Academy of Sciences (India)
The non-parametric background modelling approach proposed by Martin Hofmann et al (2012) involves modelling of foreground by the history of recently ... background subtraction system with mixture of Gaussians, deviation scaling factor and max–min background model for outdoor environment. Selection of detection ...
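The core idea of a statistically tuned Gaussian background model can be sketched with a single Gaussian per pixel. This is a minimal illustration of the deviation-scaling idea, not the mixture-of-Gaussians or max–min models the record refers to; all values are hypothetical:

```python
import numpy as np

def subtract_background(frame, mean, var, alpha=0.02, k=2.5):
    """Single-Gaussian-per-pixel background subtraction: a pixel is
    foreground when it deviates from the running mean by more than
    k standard deviations (k plays the role of the deviation scaling
    factor); background pixels update the running mean and variance."""
    frame = frame.astype(float)
    foreground = np.abs(frame - mean) > k * np.sqrt(var)
    bg = ~foreground
    mean[bg] += alpha * (frame[bg] - mean[bg])
    var[bg] += alpha * ((frame[bg] - mean[bg]) ** 2 - var[bg])
    return foreground

# Hypothetical 4x4 scene: background level 100, one bright intruding pixel
mean = np.full((4, 4), 100.0)
var = np.full((4, 4), 4.0)
frame = np.full((4, 4), 100.0)
frame[0, 0] = 200.0
print(subtract_background(frame, mean, var).sum())  # → 1
```

Updating the statistics only from background pixels keeps a foreground object from being absorbed into the model, which is the usual rationale for this masking step.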
Statistics techniques applied to electron probe microanalysis
International Nuclear Information System (INIS)
Brizuela, H.; Del Giorgio, M.; Budde, C.; Briozzo, C.; Riveros, J.
1987-01-01
A description of Montroll-West's general theory for a tridimensional random walk of a particle with internal degrees of freedom is given, connecting this problem with the master equation solution. The possibility of its application to EPMA is discussed. Numerical solutions are given for thick or collimated beams at several energies interacting with samples of different shape and size. The spatial distribution of particles within the sample, for a stationary state, is analyzed, as well as the electron backscattering coefficient. (Author)
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau's Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Sheffield, Scott
2009-01-01
In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.
Statistical mechanics of learning
Engel, Andreas
2001-01-01
The effort to build machines that are able to learn and undertake tasks such as datamining, image processing and pattern recognition has led to the development of artificial neural networks in which learning from examples may be described and understood. The contribution to this subject made over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently only found scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.
Statistical analysis of management data
Gatignon, Hubert
2013-01-01
This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Regulatory considerations for biosimilars
Directory of Open Access Journals (Sweden)
Ranjani Nellore
2010-01-01
Currently there is considerable interest in the legislative debate around generic biological drugs, or "biosimilars", in the EU and US due to the large, lucrative market that they offer to the industry. While some countries have issued a few regulatory guidelines as well as product-specific requirements, there is no general consensus on a single, simple mechanism, similar to the bioequivalence determination that leads to approval of generic small molecules all over the world. The inherently complex nature of the molecules, along with the complicated manufacturing and analytical techniques needed to characterize them, makes it difficult to rely on a single human pharmacokinetic study for assurance of safety and efficacy. In general, the concept of comparability has been used for evaluation of the currently approved "similar" biologicals, where a step-by-step assessment of the quality, preclinical, and clinical aspects is made. In India, the focus is primarily on the availability and affordability of life-saving drugs. In this context, every product needs to be evaluated on its own merit irrespective of the innovator brand. The formation of the National Biotechnology Regulatory Authority may provide a step in the right direction for regulation of these complex molecules. However, in order to have efficient machinery for initial approval and ongoing oversight with a country-specific focus, cooperation with international authorities for granting approvals and continuous risk-benefit review is essential. Several steps are still needed for India to be perceived as a country that leads the world in providing quality biological products.
Statistical inference a short course
Panik, Michael J
2012-01-01
A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests of the assumptions of randomness and normality, and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal
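The distribution-free confidence interval for a population median mentioned above can be built from order statistics alone. The sketch below is not from the book (the sample data are invented): it picks the narrowest symmetric pair of order statistics whose binomial(n, 1/2) coverage still meets the requested confidence level.

```python
import math

def median_ci(data, conf=0.95):
    """Symmetric distribution-free CI for the population median.

    Coverage of the interval (x_(d), x_(n-d+1)) is sum_{i=d}^{n-d} C(n, i) / 2**n,
    where x_(k) denotes the k-th order statistic (1-indexed).
    """
    x = sorted(data)
    n = len(x)
    best = None
    for d in range(1, n // 2 + 1):
        cov = sum(math.comb(n, i) for i in range(d, n - d + 1)) / 2 ** n
        if cov < conf:
            break                     # intervals only get narrower from here
        best = (x[d - 1], x[n - d], cov)  # narrowest interval still covering
    return best

sample = [12, 15, 9, 20, 14, 11, 17, 13, 16, 10]
lo, hi, cov = median_ci(sample)
```

For n = 10 at the 95% level this selects the 2nd and 9th order statistics, with achieved coverage just under 98% (exact coverage levels are discrete, so the nominal level is met conservatively).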
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
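The matrix formulation of linear least squares described above, together with the variance-covariance matrix used for error propagation and experiment design, can be sketched in a few lines of NumPy (the data are illustrative; the algebra follows the standard normal-equations form, with parameter covariance s²(XᵀX)⁻¹):

```python
import numpy as np

# Fit y = a + b*x by linear least squares in matrix notation.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

X = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # [intercept, slope]

resid = y - X @ beta
dof = len(y) - X.shape[1]                      # degrees of freedom
s2 = resid @ resid / dof                       # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)              # variance-covariance matrix
se = np.sqrt(np.diag(cov))                     # parameter standard errors
```

The off-diagonal element of `cov` is what feeds correlated error propagation; its diagonal gives the parameter variances used in the isothermal-titration design problem the abstract mentions.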
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
What type of statistical model to choose for the analysis of radioimmunoassays
International Nuclear Information System (INIS)
Huet, S.
1984-01-01
The current techniques used for statistical analysis of radioimmunoassays are not very satisfactory for either the statistician or the biologist. They are based on an attempt to make the response curve linear to avoid complicated computations. The present article shows that this practice has considerable effects (often neglected) on the statistical assumptions which must be formulated. A more strict analysis is proposed by applying the four-parameter logistic model. The advantages of this method are: the statistical assumptions formulated are based on observed data, and the model can be applied to almost all radioimmunoassays [fr]
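A minimal sketch of the four-parameter logistic fit advocated above, using SciPy's general nonlinear least-squares routine. The dose-response numbers are hypothetical, and the parameterization shown is one common convention, not necessarily the one used in the article:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic4(x, a, d, c, b):
    # a: response at zero dose, d: response at infinite dose,
    # c: midpoint dose (EC50), b: slope factor
    return d + (a - d) / (1.0 + (x / c) ** b)

# hypothetical calibration data generated from known parameters
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = logistic4(dose, 2.0, 0.1, 2.5, 1.3)

# recover the parameters by nonlinear least squares
popt, pcov = curve_fit(logistic4, dose, resp, p0=[2.0, 0.1, 1.0, 1.0])
```

Because the model is fitted on the original response scale, the error assumptions apply to the observed data directly, which is the article's point against linearizing transformations.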
T1 VSAT Fade Compensation Statistical Results
Johnson, Sandra K.; Acosta, Roberto; Ugweje, Oke
2000-01-01
New satellite communication systems are steadily seeking to use higher frequency bands to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communication Technology Satellite (ACTS), launched in September 1993, is the first U.S. communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including on-board baseband processing, multiple-beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this paper is to characterize the method used by the ACTS T1 Very Small Aperture Terminal (T1 VSAT) ground stations in detecting the presence of fade in the communication signal and adaptively compensating for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program were used to validate the compensation technique. A software process was developed and demonstrated to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are offered.
Directory of Open Access Journals (Sweden)
Maria Cristina DiCiaula
2014-01-01
A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of the crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all three solvents (water, ethanol, and acetone) were better than the binary mixtures in generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the various parameters in this analytical method, namely the time for the chromophoric reaction to stabilize, the wavelength of the absorption maxima to be monitored, the reference standard, and the concentration of sodium carbonate, were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w/v, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
Techniques for Wireless Applications
Gaaloul, Fakhreddine
2012-05-01
Switching techniques were first proposed as spatial diversity techniques. These techniques have been shown to reduce the processing load considerably while letting multi-antenna systems achieve a specific target performance. In this thesis, we take a different look at switching schemes by implementing them for other wireless applications. More specifically, this thesis consists of three main parts. The first part considers a multiuser environment and an adaptive scheduling algorithm based on the switching-with-post-selection scheme for statistically independent but non-identically distributed channel conditions. The performance of this switch-based scheduler is investigated and a multitude of performance metrics are presented. In the second part, we propose and analyze the performance of three switch-based algorithms for interference reduction in the downlink of overloaded femtocells. Performance metrics are derived in closed form and used to compare the three proposed schemes. Finally, in the third part, a switch-based opportunistic channel access scheme is proposed for a cognitive radio system and its performance is analyzed in terms of two newly proposed metrics, namely the average cognitive radio access and the waiting time duration.
Engaging with the Art & Science of Statistics
Peters, Susan A.
2010-01-01
How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…
Directory of Open Access Journals (Sweden)
Adrion Christine
2012-09-01
Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random-effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g., the logarithmic score). Results: The instruments under study
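Of the proper scoring rules mentioned, the logarithmic score is the simplest to compute. The toy sketch below (the predictive probabilities and observed counts are invented, not the trial data) scores two competing models' predictive distributions against observed outcomes; a lower mean score indicates better predictions:

```python
import math

def log_score(pred_probs, outcomes):
    """Mean logarithmic score (a proper scoring rule): the average of
    -log p(observed outcome) under each predictive distribution."""
    return -sum(math.log(p[y]) for p, y in zip(pred_probs, outcomes)) / len(outcomes)

# hypothetical predictive distributions over counts 0..2 at three visits
model_a = [[0.7, 0.2, 0.1], [0.6, 0.3, 0.1], [0.5, 0.3, 0.2]]
model_b = [[0.4, 0.3, 0.3], [0.3, 0.4, 0.3], [0.3, 0.3, 0.4]]
observed = [0, 0, 1]

score_a = log_score(model_a, observed)   # model A: sharper, well-calibrated
score_b = log_score(model_b, observed)   # model B: diffuse predictions
```

With leave-one-out predictive distributions (as INLA provides), the same computation yields a cross-validated score for model comparison.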
Considerations and Algorithms for Compression of Sets
DEFF Research Database (Denmark)
Larsson, Jesper
We consider compression of unordered sets of distinct elements. After a discussion of the general problem, we focus on compressing sets of fixed-length bitstrings in the presence of statistical information. We survey techniques from previous work, suggesting some adjustments, and propose a novel compression algorithm that allows transparent incorporation of various estimates for the probability distribution. Our experimental results allow the conclusion that set compression can benefit from incorporating statistics, using our method or variants of previously known techniques.
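As a rough illustration of why statistics matter here: an unordered set of k distinct n-bit strings carries only log2 C(2^n, k) bits of information, typically far below the n·k bits of a verbatim listing, and statistical knowledge of the distribution can push an encoder toward that bound. The sketch below is our own illustration of the counting bound, not the paper's algorithm:

```python
import math

def set_info_bits(n, k):
    """Minimum bits to encode an unordered set of k distinct n-bit strings:
    log2 of the number of such sets, C(2**n, k)."""
    return math.log2(math.comb(2 ** n, k))

n, k = 16, 1000
naive = n * k                   # bits for listing every element verbatim
bound = set_info_bits(n, k)     # information-theoretic minimum
```

Sorting the set and delta-encoding the gaps between consecutive elements is one classical way to approach this bound in practice.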
Computational statistics handbook with Matlab
Martinez, Wendy L
2007-01-01
Prefaces; Introduction (What Is Computational Statistics?, An Overview of the Book); Probability Concepts (Introduction, Probability, Conditional Probability and Independence, Expectation, Common Distributions); Sampling Concepts (Introduction, Sampling Terminology and Concepts, Sampling Distributions, Parameter Estimation, Empirical Distribution Function); Generating Random Variables (Introduction, General Techniques for Generating Random Variables, Generating Continuous Random Variables, Generating Discrete Random Variables); Exploratory Data Analysis (Introduction, Exploring Univariate Data, Exploring Bivariate and Trivariate Data, Exploring Multidimensional Data); Finding Structure (Introduction, Projecting Data, Principal Component Analysis, Projection Pursuit EDA, Independent Component Analysis, Grand Tour, Nonlinear Dimensionality Reduction); Monte Carlo Methods for Inferential Statistics (Introduction, Classical Inferential Statistics, Monte Carlo Methods for Inferential Statist...
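Among the general techniques for generating random variables covered in the handbook, inverse-transform sampling is the simplest. A minimal standard-library sketch for the exponential distribution (the parameters and sample size are illustrative):

```python
import math
import random

def exp_inverse_transform(lam, n, seed=42):
    """Generate exponential(lam) variates by the inverse-transform method:
    if U ~ Uniform(0, 1), then -ln(1 - U) / lam is exponentially distributed,
    because the exponential CDF F(x) = 1 - exp(-lam * x) inverts in closed form."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

sample = exp_inverse_transform(lam=2.0, n=100_000)
mean = sum(sample) / len(sample)   # should be close to 1 / lam = 0.5
```

The same recipe works for any distribution whose CDF has a computable inverse; distributions without one motivate the acceptance-rejection and mixture techniques the handbook also covers.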
A statistical manual for chemists
Bauer, Edward
1971-01-01
A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect
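The variable control charts emphasized above reduce to a small amount of arithmetic. This sketch (the subgroup measurements are made up; A2 is the standard tabulated constant for subgroups of five) computes Shewhart X-bar chart limits:

```python
# Shewhart X-bar control chart: centerline at the grand mean, control
# limits at +/- A2 * R-bar, where R-bar is the average subgroup range.
subgroups = [
    [10.2, 9.9, 10.1, 10.0, 9.8],
    [10.1, 10.3, 9.9, 10.0, 10.2],
    [9.7, 10.0, 10.1, 9.9, 10.3],
]
A2 = 0.577  # tabulated constant for subgroup size n = 5

xbars = [sum(g) / len(g) for g in subgroups]
ranges = [max(g) - min(g) for g in subgroups]
center = sum(xbars) / len(xbars)   # grand mean (centerline)
rbar = sum(ranges) / len(ranges)   # average range
ucl = center + A2 * rbar           # upper control limit
lcl = center - A2 * rbar           # lower control limit
```

A subgroup mean falling outside (lcl, ucl) signals that the process is out of statistical control, the basic decision rule behind the charts discussed in the book.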
Statistical methods for ranking data
Alvo, Mayer
2014-01-01
This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
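Rank correlation defined through distance functions, as in the book's framework, can be illustrated with Kendall's tau: the Kendall distance counts discordant pairs between two rankings, and rescaling it to [-1, 1] gives the correlation. A small self-contained sketch (the example rankings are invented):

```python
from itertools import combinations

def kendall_tau(r1, r2):
    """Kendall rank correlation via the distance-function view: count
    discordant pairs (the Kendall distance) and rescale to [-1, 1]."""
    n = len(r1)
    discordant = sum(
        1 for i, j in combinations(range(n), 2)
        if (r1[i] - r1[j]) * (r2[i] - r2[j]) < 0
    )
    # max possible distance is n*(n-1)/2, so tau = 1 - 4*d / (n*(n-1))
    return 1.0 - 4.0 * discordant / (n * (n - 1))

# two judges ranking the same five items
tau = kendall_tau([1, 2, 3, 4, 5], [2, 1, 3, 5, 4])
```

Identical rankings give tau = 1, fully reversed rankings give tau = -1; the distance formulation is what generalizes to the incomplete-data compatibility notion the book introduces.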
Finkelstein, Michael O
2015-01-01
This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...
DOE handbook: Design considerations
Energy Technology Data Exchange (ETDEWEB)
NONE
1999-04-01
The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline.
DOE handbook: Design considerations
International Nuclear Information System (INIS)
1999-04-01
The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline
International Nuclear Information System (INIS)
Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.
1978-01-01
The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response-surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response-surface model was implemented in both the analytical error-propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB
Computer intensive statistical methods
Yakowitz, S.
The special session “Computer-Intensive Statistical Methods” was held in morning and afternoon parts at the 1985 AGU Fall Meeting in San Francisco, Calif. Its mission was to provide a forum for hydrologists and statisticians who are active in bringing unconventional, algorithm-oriented statistical techniques to bear on problems of hydrology. Statistician Emanuel Parzen (Texas A&M University, College Station, Tex.) opened the session by relating recent developments in quantile estimation methods and showing how properties of such methods can be used to advantage to categorize runoff data previously analyzed by I. Rodriguez-Iturbe (Universidad Simon Bolivar, Caracas, Venezuela). Statistician Eugene Schuster (University of Texas, El Paso) discussed recent developments in nonparametric density estimation which enlarge the framework for convenient incorporation of prior and ancillary information. These extensions were motivated by peak annual flow analysis. Mathematician D. Myers (University of Arizona, Tucson) gave a brief overview of “kriging” and outlined some recently developed methodology.
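The nonparametric density estimation discussed in the session can be sketched with a plain Gaussian kernel estimator and Silverman's rule-of-thumb bandwidth (the "annual peak" numbers below are invented for illustration, not Schuster's data):

```python
import math

def gaussian_kde(data, h=None):
    """Nonparametric density estimate with a Gaussian kernel; bandwidth h
    defaults to Silverman's rule of thumb, 1.06 * sd * n^(-1/5)."""
    n = len(data)
    if h is None:
        mean = sum(data) / n
        sd = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
        h = 1.06 * sd * n ** -0.2
    def f(x):
        # average of Gaussian bumps centered at each observation
        return sum(
            math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data
        ) / (n * h * math.sqrt(2 * math.pi))
    return f

peaks = [1.0, 1.2, 0.9, 1.1, 3.0, 3.1, 2.9, 3.2]  # hypothetical annual peaks
f = gaussian_kde(peaks)
```

Unlike a fitted parametric density, the estimate reveals the bimodal structure of the data directly, which is the appeal for peak-flow analysis.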
Morphological Analysis for Statistical Machine Translation
National Research Council Canada - National Science Library
Lee, Young-Suk
2004-01-01
We present a novel morphological analysis technique which induces a morphological and syntactic symmetry between two languages with highly asymmetrical morphological structures to improve statistical...
Energy Technology Data Exchange (ETDEWEB)
Hernandez M, B. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)
1997-07-01
The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over an annual cycle (1990), based on their concentrations obtained through the PIXE technique; to identify suitable statistical methods for analyzing the metal concentrations found in the form of total suspended particles (PST); and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to serve as a basis for decision making and control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)
Managing Macroeconomic Risks by Using Statistical Simulation
Directory of Open Access Journals (Sweden)
Merkaš Zvonko
2017-06-01
Full Text Available The paper analyzes the possibilities of using statistical simulation in the measurement of macroeconomic risks. At the level of the whole world, macroeconomic risks have increased significantly due to excessive imbalances. Using analytical statistical methods and Monte Carlo simulation, the authors interpret the collected data sets, comparing and analyzing them in order to mitigate potential risks. The empirical part of the study is a qualitative case study that uses statistical methods and Monte Carlo simulation for managing macroeconomic risks, which is the central theme of this work. Statistical simulation is necessary because the system for which a model must be specified is too complex for an analytical approach. The objective of the paper is to point out the need to consider significant macroeconomic risks, particularly the number of unemployed in the society, the movement of gross domestic product and the country’s credit rating; and to use data previously processed by statistical methods, through statistical simulation, to analyze the existing model of managing macroeconomic risks and to suggest elements for developing a management model that will allow recent macroeconomic risks to emerge with the lowest possible probability and consequences. The stochastic characteristics of the system, defined by random input variables with given probability distributions, require a large number of iterations over which the model output is recorded and the mathematical expectations are calculated. The paper expounds the basic procedures and techniques of discrete statistical simulation applied to systems that can be characterized by a number of events representing sets of circumstances that change the system’s state, and the possibility of its application in the field of assessment of macroeconomic risks. The method has no
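The simulation procedure this abstract describes (random input variables drawn from assumed probability distributions, a large number of iterations, and a mathematical expectation computed over the recorded outputs) can be sketched in a few lines. The toy growth model and its parameter values below are illustrative assumptions for the example, not the authors' actual model or data.

```python
import random

def simulate_growth(n_iterations=100_000, seed=42):
    """Minimal Monte Carlo sketch: an output (growth) driven by two
    uncertain macroeconomic inputs with hypothetical distributions."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_iterations):
        unemployment = rng.gauss(8.0, 1.5)     # %, assumed normal
        credit_spread = rng.uniform(1.0, 3.0)  # %, assumed uniform
        growth = 3.0 - 0.25 * unemployment - 0.5 * credit_spread
        outcomes.append(growth)
    mean_growth = sum(outcomes) / len(outcomes)          # expectation
    downturn_prob = sum(g < 0 for g in outcomes) / len(outcomes)
    return mean_growth, downturn_prob

mean_growth, downturn_prob = simulate_growth()
print(f"expected growth: {mean_growth:.2f}%, P(negative growth): {downturn_prob:.3f}")
```

The large iteration count is what makes the estimated expectation stable; halving it roughly multiplies the standard error by 1.4.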
Introductory statistics for engineering experimentation
Nelson, Peter R; Coffin, Marie
2003-01-01
The Accreditation Board for Engineering and Technology (ABET) introduced a criterion, starting with their 1992-1993 site visits, that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy the needs of a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...
On quantum statistical inference
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...
Considerations for Visualizing Comparison.
Gleicher, Michael
2018-01-01
Supporting comparison is a common and diverse challenge in visualization. Such support is difficult to design because solutions must address both the specifics of their scenario and the general issues of comparison. This paper aids designers by providing a strategy for considering those general issues. It presents four considerations that abstract comparison. These considerations identify issues and categorize solutions in a domain-independent manner. The first considers how the common elements of comparison (a target set of items that are related, and an action the user wants to perform on that relationship) are present in an analysis problem. The second considers why these elements lead to challenges because of their scale, in number of items, complexity of items, or complexity of relationship. The third considers what strategies address the identified scaling challenges, grouping solutions into three broad categories. The fourth considers which visual designs map to these strategies to provide solutions for a comparison analysis problem. In sequence, these considerations provide a process for developers to consider support for comparison in the design of visualization tools. Case studies show how these considerations can help in the design and evaluation of visualization solutions for comparison problems.
Tuberous sclerosis Anaesthetic considerations
African Journals Online (AJOL)
QuickSilver
SYNDROMIC VIGNETTES IN ANAESTHESIA. Southern African Journal of Anaesthesia & Analgesia - May 2003. Tuberous sclerosis: anaesthetic considerations. Tuberous sclerosis (TS) was first described by Bourneville in 1880. TS is said to be one of the commonest autosomal dominant diseases.
Energy Technology Data Exchange (ETDEWEB)
Georg, Dietmar, E-mail: Dietmar.Georg@akhwien.at [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Hopfgartner, Johannes [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Gòra, Joanna [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Kuess, Peter [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Kragl, Gabriele [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Berger, Daniel [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Hegazy, Neamat [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Goldner, Gregor; Georg, Petra [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna 
(Austria)
2014-03-01
Purpose: To assess the dosimetric differences among volumetric modulated arc therapy (VMAT), scanned proton therapy (intensity-modulated proton therapy, IMPT), scanned carbon-ion therapy (intensity-modulated carbon-ion therapy, IMIT), and low-dose-rate (LDR) and high-dose-rate (HDR) brachytherapy (BT) treatment of localized prostate cancer. Methods and Materials: Ten patients were considered for this planning study. For external beam radiation therapy (EBRT), planning target volume was created by adding a margin of 5 mm (lateral/anterior–posterior) and 8 mm (superior–inferior) to the clinical target volume. Bladder wall (BW), rectal wall (RW), femoral heads, urethra, and pelvic tissue were considered as organs at risk. For VMAT and IMPT, 78 Gy(relative biological effectiveness, RBE)/2 Gy were prescribed. The IMIT was based on 66 Gy(RBE)/20 fractions. The clinical target volume planning aims for HDR-BT ({sup 192}Ir) and LDR-BT ({sup 125}I) were D{sub 90%} ≥34 Gy in 8.5 Gy per fraction and D{sub 90%} ≥145 Gy. Both physical and RBE-weighted dose distributions for protons and carbon-ions were converted to dose distributions based on 2-Gy(IsoE) fractions. From these dose distributions various dose and dose–volume parameters were extracted. Results: Rectal wall exposure 30-70 Gy(IsoE) was reduced for IMIT, LDR-BT, and HDR-BT when compared with VMAT and IMPT. The high-dose region of the BW dose–volume histogram above 50 Gy(IsoE) of IMPT resembled the VMAT shape, whereas all other techniques showed a significantly lower high-dose region. For all 3 EBRT techniques similar urethra D{sub mean} around 74 Gy(IsoE) were obtained. The LDR-BT results were approximately 30 Gy(IsoE) higher, HDR-BT 10 Gy(IsoE) lower. Normal tissue and femoral head sparing was best with BT. Conclusion: Despite the different EBRT prescription and fractionation schemes, the high-dose regions of BW and RW expressed in Gy(IsoE) were on the same order of magnitude. Brachytherapy techniques
International Nuclear Information System (INIS)
Georg, Dietmar; Hopfgartner, Johannes; Gòra, Joanna; Kuess, Peter; Kragl, Gabriele; Berger, Daniel; Hegazy, Neamat; Goldner, Gregor; Georg, Petra
2014-01-01
Purpose: To assess the dosimetric differences among volumetric modulated arc therapy (VMAT), scanned proton therapy (intensity-modulated proton therapy, IMPT), scanned carbon-ion therapy (intensity-modulated carbon-ion therapy, IMIT), and low-dose-rate (LDR) and high-dose-rate (HDR) brachytherapy (BT) treatment of localized prostate cancer. Methods and Materials: Ten patients were considered for this planning study. For external beam radiation therapy (EBRT), planning target volume was created by adding a margin of 5 mm (lateral/anterior–posterior) and 8 mm (superior–inferior) to the clinical target volume. Bladder wall (BW), rectal wall (RW), femoral heads, urethra, and pelvic tissue were considered as organs at risk. For VMAT and IMPT, 78 Gy(relative biological effectiveness, RBE)/2 Gy were prescribed. The IMIT was based on 66 Gy(RBE)/20 fractions. The clinical target volume planning aims for HDR-BT ( 192 Ir) and LDR-BT ( 125 I) were D 90% ≥34 Gy in 8.5 Gy per fraction and D 90% ≥145 Gy. Both physical and RBE-weighted dose distributions for protons and carbon-ions were converted to dose distributions based on 2-Gy(IsoE) fractions. From these dose distributions various dose and dose–volume parameters were extracted. Results: Rectal wall exposure 30-70 Gy(IsoE) was reduced for IMIT, LDR-BT, and HDR-BT when compared with VMAT and IMPT. The high-dose region of the BW dose–volume histogram above 50 Gy(IsoE) of IMPT resembled the VMAT shape, whereas all other techniques showed a significantly lower high-dose region. For all 3 EBRT techniques similar urethra D mean around 74 Gy(IsoE) were obtained. The LDR-BT results were approximately 30 Gy(IsoE) higher, HDR-BT 10 Gy(IsoE) lower. Normal tissue and femoral head sparing was best with BT. Conclusion: Despite the different EBRT prescription and fractionation schemes, the high-dose regions of BW and RW expressed in Gy(IsoE) were on the same order of magnitude. Brachytherapy techniques were clearly superior in
Energy Technology Data Exchange (ETDEWEB)
Ostrowsky, A.; Daures, J
2008-07-01
Calorimetry is the most direct dosimetric technique for determining absorbed dose. A calorimeter gives direct access to the energy imparted to matter by ionizing radiation per unit mass, by measuring the heat quantity Q produced under irradiation in its thermally insulated sensitive element. Graphite was chosen as the construction material because all the energy imparted to graphite by ionizing radiation is converted into heat. Thermistors are used for temperature measurements as well as for the electrical heating of the different bodies of the calorimeter. The construction of a calorimeter is the result of a compromise between dosimetric requirements and mechanical constraints. The difficulties encountered are examined and the solutions chosen are detailed. All technical data are gathered in this document. The aim is to provide a practical operative instruction and guidance document that can help interested laboratories in designing such an instrument. The electrical and thermal tests have shown good behaviour of the GR9 calorimeter.
A primer of multivariate statistics
Harris, Richard J
2014-01-01
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why
... Standards Act and Program: MQSA Insights, MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018 Scorecard Statistics, 2017 Scorecard Statistics, 2016 Scorecard ...
Fusion facility siting considerations
International Nuclear Information System (INIS)
Bussell, G.T.
1985-01-01
Inherent in the fusion program's transition from hydrogen devices to commercial power machines is a general increase in the size and scope of succeeding projects. This growth will lead to increased emphasis on safety, environmental impact, and the external effects of fusion in general, and of each new device in particular. A critically important consideration in this regard is site selection. The purpose of this paper is to examine major siting issues that may affect the economics, safety, and environmental impact of fusion
Storage array reflection considerations
International Nuclear Information System (INIS)
Haire, M.J.; Jordan, W.C.; Taylor, R.G.
1997-01-01
The assumptions used for reflection conditions of single containers are fairly well established and consistently applied throughout the industry in nuclear criticality safety evaluations. Containers are usually considered to be either fully water reflected (i.e., surrounded by 6 to 12 in. of water) for safety calculations or reflected by 1 in. of water for nominal (structural material and air) conditions. Tables and figures are usually available for performing comparative evaluations of containers under various loading conditions. Reflection considerations used for evaluating the safety of storage arrays of fissile material are not as well established. When evaluating arrays, it has become more common for analysts to use calculations to demonstrate the safety of the array configuration. In performing these calculations, the analyst has considerable freedom concerning the assumptions made for modeling the reflection of the array. Considerations are given for the physical layout of the array with little or no discussion (or demonstration) of what conditions are bounded by the assumed reflection conditions. For example, an array may be generically evaluated by placing it in a corner of a room in which the opposing walls are far away. Typically, it is believed that complete flooding of the room is incredible, so the array is evaluated for various levels of water mist interspersed among array containers. This paper discusses some assumptions that are made regarding storage array reflection
Mc Leod, Roger D.; Mc Leod, David M.
2002-10-01
Archimedes articulated an applied physics experience of many children who observe the upward movement of floating objects when they get into their "tubs." This same principle can effectively allow massive Egyptian construction blocks and obelisks to be elevated and erected. Platform bases at Giza were leveled by means of water channels that were cut into the rock. There is a canal behind the pyramids. The bathtub technique can elevate or transport the water-borne block (or obelisk) to sites involved, including the Sphinx temple. Water outflow from the barge locks (tubs) can erode Sphinx surrounds, without invoking 7000+ year-ago rainy weather. Our previously detailed account of how constellations, Canis Major, Phoenix, Leo can be detected at sites like America's Stonehenge, while they are below the local horizon, also indicates ancient Egyptians may have done likewise. Orion, or Leo the Sphinx could have been detected while they were in the "underground," around BCE 2500, in alignments otherwise requiring a date of BCE 1050.
Sampling, Probability Models and Statistical Reasoning Statistical ...
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Methodological Considerations in Gender Studies
Directory of Open Access Journals (Sweden)
F. Asghari
2015-11-01
Full Text Available What differentiates a scientific research study from a non-scientific one, and makes it valid or invalid, is the research methodology. Methodological considerations, especially in studies such as gender studies that are not generally covered within the framework of a single discipline, call for more precision and sensitivity due to their complicated nature and dimensions. The present article examines the methodological aspects of research carried out on the effect of gender on the job satisfaction of faculty members, using a qualitative approach and a systematic review technique, and tries to draw researchers’ attention to the necessity, importance and role of methodological considerations in enabling a deeper understanding and explanation of research, and in providing a clear, precise answer that conforms with reality for problem-solving and fulfilling the research objective. The findings of this study show two main problems in the works of research studied here. One is ambiguity and lack of transparency in the problem statement and research objective; the other is a tendency to use a single (quantitative) research method without paying attention to differences in the nature and dimensions of fields of research. It seems that in fields such as gender studies, adopting a combined approach is a more reliable choice for gaining a deeper understanding and a more valid, clear answer.
Statistics for dental researchers: descriptive statistics
Mohammad Reza Baneshi PhD; Amir Reza Ghassemi DDS; Arash Shahravan DDS, MS
2012-01-01
Descriptive statistics is the process of summarizing raw data gathered from a research study and creating useful statistics, which help the better understanding of data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, and mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics which are used for quantitative data...
Statistics in biomedical research
Directory of Open Access Journals (Sweden)
González-Manteiga, Wenceslao
2007-06-01
Full Text Available The discipline of biostatistics is nowadays a fundamental scientific component of biomedical, public health and health services research. Traditional and emerging areas of application include clinical trials research, observational studies, physiology, imaging, and genomics. The present article reviews the current situation of biostatistics, considering the statistical methods traditionally used in biomedical research, as well as the ongoing development of new methods in response to the new problems arising in medicine. Clearly, the successful application of statistics in biomedical research requires appropriate training of biostatisticians. This training should aim to give due consideration to emerging new areas of statistics, while at the same time retaining full coverage of the fundamentals of statistical theory and methodology. In addition, it is important that students of biostatistics receive formal training in relevant biomedical disciplines, such as epidemiology, clinical trials, molecular biology, genetics, and neuroscience.
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
existence in July 2006, is mandated, among its functions, to exercise statistical co-ordination between Ministries, Departments and other agencies of the Central government; ... tween the Directorate General of Commercial Intelligence and Statistics ... in some states do not play a nodal role in the coordination of statistical ...
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
Advanced Institute of Maths, Stats and Computer Science, UoH Campus, Hyderabad. His research interests include theory and practice of sample surveys ... other agencies of the Central government; and to exercise statistical audit over the statistical activities to ensure quality and integrity of the statistical products.
Single-case research design in pediatric psychology: considerations regarding data analysis.
Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E
2014-03-01
Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are becoming increasingly evaluated and used to augment data interpretation.
Permutation statistical methods an integrated approach
Berry, Kenneth J; Johnston, Janis E
2016-01-01
This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
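The defining move of the permutation methods this monograph covers is to compare an observed statistic against its distribution under random relabelings of the data at hand, with no normality or homogeneity-of-variance assumptions. A minimal two-sample sketch (the data are made up for the example):

```python
import random

def permutation_test(x, y, n_permutations=10_000, seed=0):
    """Two-sided permutation test for a difference in means.
    Returns the proportion of shuffled relabelings whose absolute
    mean difference is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)           # random relabeling of the pooled data
        x_star = pooled[:len(x)]
        y_star = pooled[len(x):]
        diff = abs(sum(x_star) / len(x) - sum(y_star) / len(y))
        if diff >= observed:
            count += 1
    return count / n_permutations

# Made-up samples with a clear mean shift:
a = [4.1, 5.0, 4.8, 5.3, 4.6, 5.1]
b = [6.2, 6.9, 6.4, 7.1, 6.6, 6.8]
p_value = permutation_test(a, b)
print(f"permutation p-value: {p_value:.4f}")
```

With samples this small the full set of relabelings could be enumerated exactly; random sampling of permutations is the usual compromise for larger data, and is what required modern computing power.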
Statistical application of groundwater monitoring data at the Hanford Site
International Nuclear Information System (INIS)
Chou, C.J.; Johnson, V.G.; Hodges, F.N.
1993-09-01
Effective use of groundwater monitoring data requires both statistical and geohydrologic interpretations. At the Hanford Site in south-central Washington state such interpretations are used for (1) detection monitoring, assessment monitoring, and/or corrective action at Resource Conservation and Recovery Act sites; (2) compliance testing for operational groundwater surveillance; (3) impact assessments at active liquid-waste disposal sites; and (4) cleanup decisions at Comprehensive Environmental Response Compensation and Liability Act sites. Statistical tests such as the Kolmogorov-Smirnov two-sample test are used to test the hypothesis that chemical concentrations from spatially distinct subsets or populations are identical within the uppermost unconfined aquifer. Experience at the Hanford Site in applying groundwater background data indicates that background must be considered as a statistical distribution of concentrations, rather than a single value or threshold. The use of a single numerical value as a background-based standard ignores important information and may result in excessive or unnecessary remediation. Appropriate statistical evaluation techniques include the Wilcoxon rank sum test, the Quantile test, "hot spot" comparisons, and Kolmogorov-Smirnov-type tests. Application of such tests is illustrated with several case studies derived from Hanford groundwater monitoring programs. To avoid possible misuse of such data, an understanding of their limitations is needed. In addition to statistical test procedures, geochemical and hydrologic considerations are integral parts of the decision process. For this purpose a phased approach is recommended that proceeds from the simple to the more complex, and from an overview to detailed analysis
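The Kolmogorov-Smirnov two-sample test mentioned in this abstract compares the empirical distribution functions of two samples; its statistic is the largest vertical distance between them, which is exactly how it treats background as a distribution rather than a single threshold. A minimal pure-Python sketch with made-up concentration data (not Hanford measurements):

```python
def ks_two_sample_statistic(x, y):
    """Maximum absolute difference between the two empirical CDFs,
    evaluated at every observed value (where the maximum must occur)."""
    def ecdf(sample, t):
        return sum(v <= t for v in sample) / len(sample)
    points = list(x) + list(y)
    return max(abs(ecdf(x, t) - ecdf(y, t)) for t in points)

# Made-up "background" vs "downgradient well" concentrations:
background = [1.2, 1.5, 1.1, 1.4, 1.3, 1.6, 1.2, 1.5]
downgradient = [2.1, 2.4, 1.9, 2.6, 2.2, 2.5, 2.0, 2.3]
d_stat = ks_two_sample_statistic(background, downgradient)
print(f"KS statistic D = {d_stat:.2f}")
```

Here the two made-up samples do not overlap at all, so D reaches its maximum value of 1; in practice D would be referred to its null distribution (or a library routine such as SciPy's `ks_2samp`) to obtain a p-value.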
A method for statistical steady state thermal analysis of reactor cores
International Nuclear Information System (INIS)
Whetton, P.A.
1981-01-01
In a previous publication the author presented a method for undertaking statistical steady state thermal analyses of reactor cores. The present paper extends the technique to an assessment of confidence limits for the resulting probability functions, which define the probability that a given thermal response value will be exceeded in a reactor core. Establishing such confidence limits is considered an integral part of any statistical thermal analysis and essential if such analyses are to be considered in any regulatory process. In certain applications the use of a best-estimate probability function may be justifiable, but it is recognised that a demonstrably conservative probability function is required for any regulatory considerations. (orig.)
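The distinction drawn in this abstract between a best-estimate exceedance probability and a demonstrably conservative one can be illustrated with a simple sketch: estimate P(response > limit) from sampled thermal responses, then attach a one-sided upper confidence bound so that the regulatory figure errs on the safe side. The response distribution and limit below are invented for illustration, not taken from the paper.

```python
import math
import random

def exceedance_with_upper_bound(samples, limit, z=1.645):
    """Best-estimate probability that a response exceeds `limit`,
    plus a one-sided ~95% upper confidence bound on that probability
    (normal approximation to the binomial proportion)."""
    n = len(samples)
    p_hat = sum(s > limit for s in samples) / n
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat, min(1.0, p_hat + z * se)

rng = random.Random(7)
# Invented "thermal response" samples: normal about 580 with sigma 15
responses = [rng.gauss(580.0, 15.0) for _ in range(20_000)]
p_best, p_conservative = exceedance_with_upper_bound(responses, limit=610.0)
print(f"best estimate: {p_best:.4f}, conservative bound: {p_conservative:.4f}")
```

The gap between the two numbers shrinks as the sample size grows, which is the practical cost of demonstrable conservatism.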
Statistics for dental researchers: descriptive statistics
Directory of Open Access Journals (Sweden)
Mohammad Reza Baneshi PhD
2012-09-01
Full Text Available Descriptive statistics is the process of summarizing raw data gathered from a research study and creating useful statistics, which help the better understanding of data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, and mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics which are used for quantitative data. In health sciences, the majority of continuous variables follow a normal distribution. Skewness and kurtosis are two statistics which help to compare a given distribution with the normal distribution.
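The quantitative descriptive statistics listed in this abstract are all available in the Python standard library; a small sketch with made-up data:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up quantitative sample

print("mean:    ", statistics.mean(data))            # arithmetic average
print("median:  ", statistics.median(data))          # middle value
print("mode:    ", statistics.mode(data))            # most frequent value
print("stdev:   ", round(statistics.stdev(data), 3))  # sample standard deviation
print("variance:", round(statistics.variance(data), 3))
print("range:   ", max(data) - min(data))            # spread of the data
```

The standard error of the mean is simply `statistics.stdev(data) / math.sqrt(len(data))`; frequency percentages for qualitative data can be obtained with `collections.Counter`.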
Room considerations with TAVR.
Kleiman, Neal
2012-01-01
While transcatheter aortic valve replacement is considered a viable alternative to traditional surgery for patients with critical aortic stenosis, it is still a cardiac surgical procedure with a steep learning curve. Space consideration is a key aspect of the procedure's success. A TAVR program requires the commitment from and investment of institutional resources, the outfitting of an appropriate procedure room, and meticulous training of a multidisciplinary TAVR team. Careful integration of the various imaging modalities, medical specialties, and equipment is necessary to ensure the safety and efficacy of the procedure and to treat complications that may arise.
Part 8. Deployment considerations
International Nuclear Information System (INIS)
Dance, K.D.; Chang, Y.I.; Daly, T.A.
1980-01-01
This report addresses considerations of fast breeder reactor development and deployment from a national perspective. Nations vary greatly in their expertise and interest relative to nuclear power, and hence a single set of steps to be taken by a nation in decision-making on breeder development and deployment cannot be presented. The approach taken in this report is to present discussions on key factors influencing the breeder development and deployment decisions, especially in non-breeder nations, by drawing upon historical perspectives of the Light Water Reactor for comparison
Intermediate statistics a modern approach
Stevens, James P
2007-01-01
Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.
International Nuclear Information System (INIS)
Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan
2008-01-01
The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was done within the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches to identify component ageing from operational data. Engineering considerations are out of the scope of the present study
Statistical physics of vaccination
Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-12-01
Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination-one of the most important preventive measures of modern times-is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
Statistical validation of stochastic models
Energy Technology Data Exchange (ETDEWEB)
Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering
1996-12-31
It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
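The bootstrap mentioned above can be sketched in a few lines; this is a generic percentile-bootstrap illustration (the sample data and settings are invented, not taken from the paper):

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for an arbitrary statistic:
    resample the data with replacement, recompute the statistic each time,
    and read off the empirical alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical measurements of some response quantity
sample = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5]
lo, hi = bootstrap_ci(sample)
```

Because the bootstrap makes no Gaussian assumption, the same routine works unchanged for medians, variances, or any other user-specified measure of system behavior.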
For the last 30 years static chamber methodologies have been most commonly used to measure N2O fluxes from agricultural soils. The main advantages of this technique are that it is relatively inexpensive, versatile in the field, and the technology is very easy to adopt. Consequently, the majority of ...
Reliability Considerations for the Operation of Large Accelerator User Facilities
Willeke, F.J.
2016-01-01
The lecture provides an overview of considerations relevant to achieving highly reliable operation of accelerator-based user facilities. It begins with an overview of the statistical reliability formalism, followed by high-reliability design considerations with examples, and closes with operational aspects of high reliability such as preventive maintenance and spares inventory.
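The statistical reliability formalism referred to can be illustrated with the textbook availability relations (the numbers below are invented for illustration):

```python
def steady_state_availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(availabilities):
    """Subsystems in series: the facility runs only if all of them do,
    so the individual availabilities multiply."""
    a = 1.0
    for x in availabilities:
        a *= x
    return a

# Hypothetical subsystem: 500 h mean time between failures, 2 h mean repair
a_sub = steady_state_availability(500, 2)
# Ten such subsystems in series drag facility availability down noticeably,
# which is why preventive maintenance and spares inventory matter so much.
a_facility = series_availability([a_sub] * 10)
```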
Algebraic statistics computational commutative algebra in statistics
Pistone, Giovanni; Wynn, Henry P
2000-01-01
Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two-level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.
Radiological considerations for decommissioning
International Nuclear Information System (INIS)
Adler, J.J.
1993-01-01
It has been said, by those uninitiated in decommissioning work, that the radiological considerations required for decommissioning are the same as those for an operating facility. In reality, nothing could be further from the truth. The act of decommissioning can be likened to cutting off a tree limb while sitting on it. This paper discusses some of the unique radiological aspects associated with implementing a decommissioning health physics program. There are physical constraints that may cause major differences between a normal operational and a decommissioning health physics program. Throughout the decommissioning process, the installed equipment and services that were needed to support an operational program are constantly being removed or may already be disabled due to the age of the facility. Those affecting radiological protection programs typically include radiation shielding, ventilation systems, breathing air supply for respiratory protection, and radiological monitoring systems
Adrenal Gland Tumors: Statistics
Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...
Neuroendocrine Tumor: Statistics
Approved by the Cancer.Net Editorial Board. It is important to remember that statistics on the survival rates for people with a ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District of Col...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Childhood Cancer Statistics – Graphs and Infographics: number of diagnoses, incidence rates over time, cancer deaths per year, 5-year survival rate. Childhood Cancer Statistics – Important Facts: each year, the ...
Modern statistics for the social and behavioral sciences a practical introduction
Wilcox, Rand
2011-01-01
The relative advantages and disadvantages of various techniques are presented to help the reader understand the choices involved in using them. … A considerable number of illustrations are included, and the book focuses on R as its computer software application. … A useful text for … postgraduate students in the social science disciplines. (Susan Starkings, International Statistical Review, 2012) This is an interesting and valuable book … By gathering a mass of results on that topic into a single volume with references, alternative procedures, and supporting software, th
Directory of Open Access Journals (Sweden)
Gabriel Constantino Blain
2009-03-01
Full Text Available A correct understanding of the succession of weather types observable in a region is a fundamental step in reducing the climate risk associated with agriculture. The aim of this study was to analyze the temporal variability of monthly rainfall data from eight locations in the State of São Paulo, Brazil; possible climate trends were also investigated. Using wavelet analysis, the likelihood ratio test, and the Mann-Kendall test, a very high temporal variability of monthly rainfall was observed in all eight regions. Treating such series as strictly stationary, or conducting analyses only in the frequency domain, would lose information about the forcings that modulate this stochastic process. Despite this variability, no marked climatic trends were detected in the rainfall regime. From an agrometeorological standpoint, this high variability introduces considerable uncertainty into zoning assessments of the potential water supply to crops; this uncertainty should be taken into account when classifying areas as suitable, marginal, or unsuitable.
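The Mann-Kendall test used in the study is simple enough to sketch directly; this minimal version assumes no tied values (real rainfall series would need the tie correction):

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic and its normal approximation Z for a
    monotonic trend. Z beyond +/-1.96 rejects 'no trend' at the 5% level."""
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var)
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    return s, z

# A steadily rising toy series yields a strongly positive Z
s, z = mann_kendall([10, 12, 11, 15, 17, 16, 20, 22, 21, 25])
```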
Short clinical crowns (SCC) – treatment considerations and techniques
Rahul, G. R.; Poduval, Soorya T.; Shetty, Karunakar
2012-01-01
When the clinical crowns of teeth are dimensionally inadequate, esthetically and biologically acceptable restoration of these dental units is difficult. Often an acceptable restoration cannot be accomplished without first surgically increasing the length of the existing clinical crowns; therefore, successful management requires an understanding of both the dental and periodontal parameters of treatment. The complications presented by teeth with short clinical crowns demand a comprehensive treatment plan and proper sequencing of therapy to ensure a satisfactory result. Visualization of the desired result is a prerequisite of successful therapy. This review examines the periodontal and restorative factors related to restoring teeth with short clinical crowns. Modes of therapy are usually combined to meet the biologic, restorative, and esthetic requirements imposed by short clinical crowns. Various methods for treating short clinical crowns are reviewed; the role that restoration margin location plays in maintaining periodontal and dental symbiosis, and the effects of violating the supracrestal gingivae with improper full-coverage restorations, are also discussed. Key words: short clinical crown, surgical crown lengthening, forced eruption, diagnostic wax-up, alveoloplasty, gingivectomy. PMID:24558561
Short clinical crowns (SCC) - treatment considerations and techniques.
Sharma, Ashu; Rahul, G R; Poduval, Soorya T; Shetty, Karunakar
2012-10-01
When the clinical crowns of teeth are dimensionally inadequate, esthetically and biologically acceptable restoration of these dental units is difficult. Often an acceptable restoration cannot be accomplished without first surgically increasing the length of the existing clinical crowns; therefore, successful management requires an understanding of both the dental and periodontal parameters of treatment. The complications presented by teeth with short clinical crowns demand a comprehensive treatment plan and proper sequencing of therapy to ensure a satisfactory result. Visualization of the desired result is a prerequisite of successful therapy. This review examines the periodontal and restorative factors related to restoring teeth with short clinical crowns. Modes of therapy are usually combined to meet the biologic, restorative, and esthetic requirements imposed by short clinical crowns. Various methods for treating short clinical crowns are reviewed; the role that restoration margin location plays in maintaining periodontal and dental symbiosis, and the effects of violating the supracrestal gingivae with improper full-coverage restorations, are also discussed. Key words: short clinical crown, surgical crown lengthening, forced eruption, diagnostic wax-up, alveoloplasty, gingivectomy.
Techniques and Considerations for FIA forest fragmentation analysis
Andrew J. Lister; Tonya W. Lister; Rachel Riemann; Mike Hoppus
2002-01-01
The Forest Inventory and Analysis unit of the Northeastern Research Station (NEFIA) is charged with inventorying and monitoring the Nation's forests. NEFIA has not gathered much information on forest fragmentation, but recent developments in computing and remote sensing technologies now make it possible to assess forest fragmentation on a regional basis. We...
A consideration of veld condition assessment techniques for ...
African Journals Online (AJOL)
... of the major livestock producing areas of South Africa. Language: English. Keywords: botany; decreaser species; evaluation; grazing; grazing capacity; increaser species; livestock; livestock production; procedure; South Africa; vegetation type; vegetation types; veld condition; veld condition assessment; veld conditions ...
Considerations and Algorithms for Compression of Sets
DEFF Research Database (Denmark)
Larsson, Jesper
We consider compression of unordered sets of distinct elements. After a discussion of the general problem, we focus on compressing sets of fixed-length bitstrings in the presence of statistical information. We survey techniques from previous work, suggesting some adjustments, and propose a novel...
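A standard baseline for this problem, which the surveyed techniques refine, is to exploit the unordered-set structure by sorting and gap-encoding; a minimal sketch (not the authors' proposed method):

```python
def gap_encode(bitstrings):
    """Encode a set of fixed-length bitstrings as a first value plus gaps.
    Sorting discards the (irrelevant) element order, and the resulting
    small gaps are where a variable-length integer code can save bits."""
    values = sorted(int(b, 2) for b in bitstrings)
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def gap_decode(gaps, width):
    """Invert gap_encode, restoring the set of width-bit strings."""
    values, total = [], 0
    for g in gaps:
        total += g
        values.append(format(total, f'0{width}b'))
    return set(values)

s = {'0011', '0101', '1100', '0100'}
assert gap_decode(gap_encode(s), width=4) == s  # lossless round trip
```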
Epigenetic considerations in aquaculture
Directory of Open Access Journals (Sweden)
Mackenzie R. Gavery
2017-12-01
Full Text Available Epigenetics has attracted considerable attention with respect to its potential value in many areas of agricultural production, particularly under conditions where the environment can be manipulated or natural variation exists. Here we introduce key concepts and definitions of epigenetic mechanisms, including DNA methylation, histone modifications and non-coding RNA, review the current understanding of epigenetics in both fish and shellfish, and propose key areas of aquaculture where epigenetics could be applied. The first key area is environmental manipulation, where the intention is to induce an ‘epigenetic memory’ either within or between generations to produce a desired phenotype. The second key area is epigenetic selection, which, alone or combined with genetic selection, may increase the reliability of producing animals with desired phenotypes. Based on aspects of life history and husbandry practices in aquaculture species, the application of epigenetic knowledge could significantly affect the productivity and sustainability of aquaculture practices. Conversely, clarifying the role of epigenetic mechanisms in aquaculture species may upend traditional assumptions about selection practices. Ultimately, there are still many unanswered questions regarding how epigenetic mechanisms might be leveraged in aquaculture.
Saudi dental students' perceptions of pediatric behavior guidance techniques.
Al-Jobair, Asma M; Al-Mutairi, Manal A
2015-09-10
Dental students receive theoretical and clinical training in pediatric behavioral guidance techniques at university. Therefore, the content of the educational course and the degree of training in behavioral techniques may have an impact on the students' perceptions and practice of such techniques. The purpose of this study was to evaluate Saudi dental students' perceptions of behavior guidance techniques used in pediatric dentistry, and to assess the changes in their perceptions after 1 academic year of a didactic and clinical educational course. This longitudinal study was carried out once at the beginning and once at the end of the 2013/2014 academic year at the College of Dentistry, King Saud University in Riyadh, Saudi Arabia. A questionnaire measuring the perceived acceptability of behavior guidance techniques was completed by 78 fourth-year dental students before and after a pediatric dental course. Acceptability ratings were scored on a 5-point Likert scale and compared and evaluated in relation to demographic data. Paired t-test and one-way analysis of variance were used for the statistical analyses. Before the course, the highest scores were for reinforcement and desensitizing techniques and the lowest were for aversive and communicative techniques. After the course, statistically significant increases were found in the acceptability of aversive techniques (voice control and hand-over-mouth), all pharmacological techniques, and modeling. Most communicative techniques and clinical situations were also rated as significantly more acceptable. Statistically significant decreases in acceptability ratings were found in promising a toy, and immobilization by staff or a parent. Immobilization using a papoose board, modeling, the presence of parents during the child's treatment, and most communicative techniques were rated as significantly more acceptable by male students than female students. In general, Saudi dental students rated most basic behavior guidance
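The paired t-test used to compare before/after acceptability ratings can be sketched as follows; the Likert data here are invented, not the study's:

```python
import math
import statistics

def paired_t(before, after):
    """Paired t statistic: mean of within-rater differences divided by
    its standard error. Each rater serves as their own control."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical 1-5 Likert acceptability ratings from eight students,
# before and after the pediatric dental course
before = [2, 3, 2, 1, 3, 2, 2, 3]
after = [4, 4, 3, 3, 4, 3, 4, 4]
t = paired_t(before, after)
```

With n - 1 = 7 degrees of freedom, |t| above about 2.36 is significant at the 5% level (two-tailed).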
Specific Considerations for LGBT Eldercare
Claassen, Ashley
2014-01-01
The LGBT community has changed considerably in numbers, awareness, and acceptance in the past few decades. Due to the growing number of LGBT elderly, special considerations must be taken into account in planning their care. This study aimed at showcasing considerations that should be taken into account when planning elderly care and future elderly living arrangements. The research questions used were (a) are there special considerations for LGBT seniors? and (b), if so, what are the cons...
Energy Technology Data Exchange (ETDEWEB)
Petridis, C.; Ries, T.; Cramer, M.C.; Graessner, J.; Petersen, K.U.; Reitmeier, F.; Jaehne, M.; Weiss, F.; Adam, G.; Habermann, C.R.
2007-02-15
Purpose: To evaluate an ultra-fast sequence for MR sialography requiring no post-processing, and to compare the acquisition technique, with regard to the effect of oral stimulation, with a parallel acquisition technique in patients with salivary gland diseases. Materials and Methods: 128 patients with salivary gland disease were prospectively examined using a 1.5-T superconducting system with a 30 mT/m maximum gradient capability and a maximum slew rate of 125 mT/m/sec. A single-shot turbo-spin-echo sequence (ss-TSE) with an acquisition time of 2.8 sec was used in transverse and oblique sagittal orientation. All images were obtained with and without a parallel imaging technique. The ductal system of the parotid and submandibular glands was evaluated on each side using a visual scale of 1-5. The images were assessed by two independent experienced radiologists. An ANOVA with post hoc comparisons and an overall two-tailed significance level of p = 0.05 was used for the statistical evaluation. An intraclass correlation was computed to evaluate interobserver variability, with a correlation of >0.8 taken to indicate high agreement. Results: Depending on the diagnosed diseases and in the absence of abruption of the ducts, all parts of the excretory ducts could be visualized in all patients using the developed technique, with an overall rating for all ducts of 2.70 (SD ± 0.89). A high correlation was achieved between the two observers, with an intraclass correlation of 0.73. Oral application of a sialagogue improved the visibility of the excretory ducts significantly (p < 0.001). In contrast, the use of a parallel imaging technique led to a significant decrease in image quality (p = 0.011). (orig.)
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
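The two special cases can be made concrete through their mean occupation numbers; a small sketch of the standard formulas (not the paper's generalized statistics):

```python
import math

def mean_occupation(energy, mu, kT, s):
    """Mean occupation number n(E) = 1 / (exp((E - mu)/kT) + s),
    with s = +1 for Fermi-Dirac and s = -1 for Bose-Einstein
    (s = 0 recovers classical Maxwell-Boltzmann statistics)."""
    return 1.0 / (math.exp((energy - mu) / kT) + s)

# At E = mu, the Fermi-Dirac occupancy is exactly 1/2 (units arbitrary)
n_fd_at_mu = mean_occupation(1.0, 1.0, 0.1, +1)
# Above the chemical potential, bosons still occupy states more readily
n_be = mean_occupation(1.2, 1.0, 0.1, -1)
n_fd = mean_occupation(1.2, 1.0, 0.1, +1)
```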
Statistical Analysis by Statistical Physics Model for the STOCK Markets
Wang, Tiansong; Wang, Jun; Fan, Bingli
A new stochastic stock price model of stock markets, based on the contact process of statistical physics, is presented in this paper; the contact model is a continuous-time Markov process, and one interpretation of it is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and Shenzhen Stock Exchange (SZSE) are studied. The data of the SSE Composite Index and the SZSE Component Index are analyzed, and the corresponding simulations are carried out by computer. Further, we investigate the statistical properties, fat-tail phenomena, power-law distributions, and long memory of returns for these indices. The techniques of the skewness-kurtosis test, the Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characteristics of the stock price returns.
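The skewness-kurtosis diagnostics mentioned can be sketched on a toy price path; the geometric random walk below is illustrative only (a pure Gaussian walk, unlike the contact-process model, shows no fat tails):

```python
import math
import random
import statistics

def sample_skew_kurtosis(xs):
    """Sample skewness and excess kurtosis (simple moment estimators).
    Heavy-tailed returns show up as excess kurtosis well above zero."""
    m = statistics.mean(xs)
    s = statistics.pstdev(xs)
    n = len(xs)
    skew = sum((x - m) ** 3 for x in xs) / (n * s ** 3)
    kurt = sum((x - m) ** 4 for x in xs) / (n * s ** 4) - 3
    return skew, kurt

# Toy geometric random walk standing in for an index price series
rng = random.Random(0)
prices = [100.0]
for _ in range(1000):
    prices.append(prices[-1] * math.exp(rng.gauss(0, 0.01)))
returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
skew, kurt = sample_skew_kurtosis(returns)
```

For this Gaussian toy both statistics stay near zero; real index returns typically show strongly positive excess kurtosis, which is the fat-tail phenomenon the paper investigates.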
Characterizing Financial and Statistical Literacy
DEFF Research Database (Denmark)
Di Girolamo, Amalia; Harrison, Glenn W.; Lau, Morten
We characterize the literacy of an individual in a domain by their elicited subjective belief distribution over the possible responses to a question posed in that domain. We consider literacy across several financial, economic and statistical domains. We find considerable demographic heterogeneity in the degree of literacy. We also characterize the degree of consistency within a sample about their knowledge, even when that knowledge is imperfect. We show how uncertainty aversion might be a normatively attractive behavior for individuals who have imperfect literacy. Finally, we discuss extensions of our approach to characterize financial capability, the consequences of non-literacy, social literacy, and the information content of hypothetical survey measures of literacy.
Swiss electricity statistics 2003
International Nuclear Information System (INIS)
2004-01-01
This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2003. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2003, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2003 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2010. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data
Swiss electricity statistics 2002
International Nuclear Information System (INIS)
2003-01-01
This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2002. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2002, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2002 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2009. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data
Aerobiology: Experimental Considerations, Observations, and Future Tools
Haddrell, Allen E.; Thomas, Richard J.
2017-01-01
Understanding airborne survival and decay of microorganisms is important for a range of public health and biodefense applications, including epidemiological and risk analysis modeling. Techniques for experimental aerosol generation, retention in the aerosol phase, and sampling require careful consideration and understanding so that they are representative of the conditions the bioaerosol would experience in the environment. This review explores the current understanding of atmospheri...
Technical note - Considerations for MR imaging of small animals
Energy Technology Data Exchange (ETDEWEB)
Baker, Martin A., E-mail: m.a.baker@liv.ac.u [Small Animal Teaching Hospital, University of Liverpool, Chester High Road, Neston, Wirral CH64 7TE (United Kingdom)
2011-05-15
Routine clinical veterinary use of MR scanning is becoming more common. This article addresses the major technical considerations for radiographers performing MR examinations on small animals and provides practical advice for scanning techniques.
Telling the truth with statistics
CERN. Geneva; CERN. Geneva. Audiovisual Unit
2002-01-01
This course of lectures will cover probability, distributions, fitting, errors and confidence levels, for practising High Energy Physicists who need to use statistical techniques to express their results. Concentrating on these appropriate specialist techniques means that they can be covered in appropriate depth, while assuming only the knowledge and experience of a typical Particle Physicist. The different definitions of probability will be explained, and it will become apparent why this basic subject is so controversial; there are several viewpoints, and it is important to understand them all, rather than abusing the adherents of different beliefs. Distributions will be covered: the situations they arise in, their useful properties, and the amazing result of the Central Limit Theorem. Fitting a parametrisation to a set of data is one of the most widespread uses of statistics: there are many ways of doing this, and these will be presented, with discussion of which is appropriate in different circumstances. This t...
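The amazing result of the Central Limit Theorem mentioned above is easy to demonstrate numerically; a toy sketch with uniform draws (sample sizes invented):

```python
import random
import statistics

# Each entry is the mean of 50 draws from U(0,1); by the CLT the collection
# of such means is approximately Normal(0.5, sqrt(1/12)/sqrt(50)).
rng = random.Random(42)
means = [statistics.mean(rng.random() for _ in range(50)) for _ in range(4000)]
m = statistics.mean(means)
s = statistics.stdev(means)
```

The striking part is that the uniform shape of the underlying distribution leaves no trace: only its mean and variance survive in the limit.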
Understanding search trees via statistical physics
Indian Academy of Sciences (India)
The m-ary search tree model (where m stands for the number of branches of the search tree) is an important problem for data storage in computer science; it is studied using a variety of statistical physics techniques that allow exact asymptotic results to be obtained.
Data Mining: Going beyond Traditional Statistics
Zhao, Chun-Mei; Luan, Jing
2006-01-01
The authors provide an overview of data mining, giving special attention to the relationship between data mining and statistics to unravel some misunderstandings about the two techniques. (Contains 1 figure.)
On Quantum Statistical Inference, II
Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P. E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...
Practical Statistics for Particle Physicists
Lista, Luca
2017-01-01
These three lectures provide an introduction to the main concepts of statistical data analysis useful for precision measurements and searches for new signals in High Energy Physics. The frequentist and Bayesian approaches to probability theory will be introduced and, for both approaches, inference methods will be presented. Hypothesis tests will be discussed, then significance and upper limit evaluation will be presented, with an overview of the modern and most advanced techniques adopted for data analysis at the Large Hadron Collider.
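Upper limit evaluation in its simplest form, the Poisson counting experiment, can be sketched as follows (a generic frequentist construction, not the specific LHC machinery):

```python
import math

def poisson_upper_limit(n_obs, cl=0.95, tol=1e-9):
    """Frequentist upper limit on a Poisson mean given n_obs counts:
    the smallest mu with P(N <= n_obs | mu) <= 1 - cl, found by bisection
    (P(N <= n_obs | mu) decreases monotonically in mu)."""
    def p_le(mu):
        return sum(math.exp(-mu) * mu ** k / math.factorial(k)
                   for k in range(n_obs + 1))
    lo, hi = 0.0, 100.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if p_le(mid) > 1 - cl:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Zero observed events reproduces the familiar "rule of three":
# the 95% CL upper limit on the Poisson mean is about 3.
limit0 = poisson_upper_limit(0)
```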
Applied statistics for social and management sciences
Miah, Abdul Quader
2016-01-01
This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.
International Conference on Robust Statistics 2015
Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta
2016-01-01
This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in non-traditional areas, including the use of new techniques such as skew and mixtures of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...
Contributions to sampling statistics
Conti, Pier; Ranalli, Maria
2014-01-01
This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...
Understanding search trees via statistical physics
Indian Academy of Sciences (India)
Other applications of statistical physics (networks, traffic flows, algorithmic problems, econophysics, astrophysical applications, etc.) ... of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.
Marrakesh International Conference on Probability and Statistics
Ouassou, Idir; Rachdi, Mustapha
2015-01-01
This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.
Statistical and thermal physics with computer applications
Gould, Harvey
2010-01-01
This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
Statistics: Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics topics: Mental Illness, Any Anxiety Disorder ...
Blood Facts and Statistics: facts about blood needs and facts about American Red Cross Blood Services. Every two seconds someone in the U.S. ...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Developments in Statistical Education.
Kapadia, Ramesh
1980-01-01
The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Fisher's Contributions to Statistics
Indian Academy of Sciences (India)
T Krishnan received his Ph.D. from the Indian Statistical Institute. He joined the faculty of ISI in 1965 and has been with the Institute ever since. He is at present a professor in the Applied Statistics, Surveys and Computing Division of the Institute. Krishnan's research interests are in Statistical Pattern Recognition ...
Indian Academy of Sciences (India)
IAS Admin
Dirac statistics, identical and indistinguishable particles, Fermi gas. ... They obey Fermi–Dirac statistics. In contrast, those with integer spin, such as photons, mesons, and 7Li atoms, are called bosons and obey Bose–Einstein statistics. ... hypothesis (which later was extended as the third law of thermodynamics) was ...
Understanding the statistics of small risks
International Nuclear Information System (INIS)
Siddall, E.
1983-10-01
Monte Carlo analyses are used to show what inferences can and cannot be drawn when either a very small number of accidents results from a considerable exposure or a very small number of people, down to a single individual, is exposed to small added risks. The distinction between relative and absolute uncertainty is illustrated. No new statistical principles are involved.
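The kind of Monte Carlo reasoning this report describes can be illustrated with a short simulation. The accident rates, exposure period, and Poisson model below are illustrative assumptions, not values from Siddall's analysis.

```python
import math
import random

# Illustrative sketch (assumed rates and Poisson model, not Siddall's data):
# with small expected accident counts, very different underlying rates can
# all plausibly produce an observation of zero accidents, so a small count
# constrains the true rate only weakly.
def prob_zero_accidents(rate_per_year, years, trials=100_000, seed=1):
    """Estimate P(no accidents over the exposure period) by simulation."""
    rng = random.Random(seed)
    mean = rate_per_year * years
    zero = 0
    for _ in range(trials):
        # Draw a Poisson(mean) count by inversion (adequate for small means).
        u, k, p = rng.random(), 0, math.exp(-mean)
        cum = p
        while u > cum:
            k += 1
            p *= mean / k
            cum += p
        if k == 0:
            zero += 1
    return zero / trials

for rate in (0.01, 0.05, 0.10):  # hypothetical accidents per reactor-year
    print(f"rate={rate}: P(0 accidents in 10 y) ~ {prob_zero_accidents(rate, 10):.3f}")
```

Under these assumptions the three rates give roughly 0.90, 0.61, and 0.37, so a ten-year record with no accidents cannot sharply distinguish between them.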
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
DEFF Research Database (Denmark)
Wang, X.; Heimann, T.; Lo, P.
2012-01-01
The segmentation of tree-like tubular structures such as coronary arteries and airways is an essential step for many 3D medical imaging applications. Statistical tracking techniques for the extraction of elongated structures have received considerable attention in recent years due to their robustness against image noise and pathological changes. However, most tracking methods are limited to a specific application and do not support branching structures efficiently. In this work, we present a novel statistical tracking approach for the extraction of different types of tubular structures...
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
STATISTICAL ANALYSIS OF MONETARY POLICY INDICATORS VARIABILITY
Directory of Open Access Journals (Sweden)
ANAMARIA POPESCU
2016-10-01
Full Text Available This paper attempts to characterize, through statistical indicators, the statistical data that we have available. The purpose of this paper is to present statistical indicators (primary and secondary, simple and synthetic) that are frequently used for the statistical characterization of statistical series. We can thus analyze central tendency, data variability, and the form and concentration of distributions, using analytical tools in Microsoft Excel that enable automatic calculation of descriptive statistics via the Data Analysis option in the Tools menu. The links which exist between statistical variables can be studied using two techniques: correlation and regression. From the analysis of monetary policy in the period 2003-2014 and information provided by the website of the National Bank of Romania (BNR), there seems to be a certain tendency towards eccentricity and asymmetry in the financial data series.
Statistical methods for astronomical data analysis
Chattopadhyay, Asis Kumar
2014-01-01
This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...
Semiclassical analysis, Witten Laplacians, and statistical mechanics
Helffer, Bernard
2002-01-01
This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
T J Rao, C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India. Resonance – Journal of Science Education.
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever-increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, build tools for data analysis. Statistical datasets curated by National
Adaptive RAC codes employing statistical channel evaluation ...
African Journals Online (AJOL)
An adaptive encoding technique using row and column array (RAC) codes employing a different number of parity columns that depends on the channel state is proposed in this paper. The trellises of the proposed adaptive codes and a statistical channel evaluation technique employing these trellises are designed and ...
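As a point of reference for the row-and-column array code family the abstract builds on, a fixed (non-adaptive) RAC encoder can be sketched as follows. The layout and bit values are illustrative; the paper's adaptive trellis construction is not reproduced here.

```python
# Minimal sketch of a basic row-and-column array (RAC) parity code: data bits
# are arranged in a rows x cols array, then one parity bit is appended per
# row and a final parity row per column. (Illustrative fixed layout; the
# paper's adaptive scheme varies the number of parity columns with the
# channel state.)
def rac_encode(bits, rows, cols):
    assert len(bits) == rows * cols
    grid = [list(bits[r * cols:(r + 1) * cols]) for r in range(rows)]
    for row in grid:
        row.append(sum(row) % 2)                       # row parity bit
    grid.append([sum(col) % 2 for col in zip(*grid)])  # column parity row
    return grid

codeword = rac_encode([1, 0, 1, 1, 0, 0], rows=2, cols=3)
for row in codeword:
    print(row)
```

A single bit error flips exactly one row parity and one column parity, which locates the error for correction; that is the property the adaptive scheme trades against overhead.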
Statistical feature extraction based iris recognition system
Indian Academy of Sciences (India)
Atul Bansal
Abstract. Iris recognition systems have been proposed by numerous researchers using different feature extraction techniques for accurate and reliable biometric authentication. In this paper, a statistical feature extraction technique based on correlation between adjacent pixels has been proposed and implemented. Ham-.
Dealing with statistics what you need to know
Brown, Reva Berman
2007-01-01
A guide to the essential statistical skills needed for success in assignments, projects or dissertations. It explains why it is impossible to avoid using statistics in analysing data. It also describes the language of statistics to make it easier to understand the various terms used for statistical techniques.
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Statistical Methods for Environmental Pollution Monitoring
Energy Technology Data Exchange (ETDEWEB)
Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
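One technique the book singles out, estimating the mean of a lognormal distribution from log-scale statistics (its Section 13.2 covers confidence intervals), can be sketched for the point estimate. The data below are simulated under assumed parameters, not taken from the book.

```python
import math
import random
import statistics

# Sketch of the standard point estimate of a lognormal mean, exp(ybar + s^2/2),
# computed from log-transformed pollution-style data. (Assumed parameters;
# the book's Section 13.2 gives the corresponding confidence interval, which
# is not reproduced here.)
def lognormal_mean_estimate(samples):
    logs = [math.log(v) for v in samples]
    ybar = statistics.mean(logs)
    s2 = statistics.variance(logs)     # sample variance on the log scale
    return math.exp(ybar + s2 / 2)     # estimates E[X] = exp(mu + sigma^2/2)

rng = random.Random(7)
mu, sigma = 1.0, 0.5                   # hypothetical log-scale parameters
data = [math.exp(rng.gauss(mu, sigma)) for _ in range(5000)]
print(round(lognormal_mean_estimate(data), 2))   # true mean is exp(1.125) ~ 3.08
```

The log-scale estimator is preferred over the naive sample mean for skewed pollution data because it uses the distributional form directly.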
Advanced LBB methodology and considerations
Energy Technology Data Exchange (ETDEWEB)
Olson, R.; Rahman, S.; Scott, P. [Battelle, Columbus, OH (United States)] [and others]
1997-04-01
LBB applications have existed in many industries and more recently have been applied in the nuclear industry under limited circumstances. Research over the past 10 years has evolved the technology so that more advanced consideration of LBB can now be given. Some of the advanced considerations for nuclear plants subjected to seismic loading evaluations are summarized in this paper.
Ethical Considerations in Technology Transfer.
Froehlich, Thomas J.
1991-01-01
Examines ethical considerations involved in the transfer of appropriate information technology to less developed countries. Approaches to technology are considered; two philosophical frameworks for studying ethical considerations are discussed, i.e., the Kantian approach and the utilitarian perspective by John Stuart Mill; and integration of the…
Advanced LBB methodology and considerations
International Nuclear Information System (INIS)
Olson, R.; Rahman, S.; Scott, P.
1997-01-01
LBB applications have existed in many industries and more recently have been applied in the nuclear industry under limited circumstances. Research over the past 10 years has evolved the technology so that more advanced consideration of LBB can now be given. Some of the advanced considerations for nuclear plants subjected to seismic loading evaluations are summarized in this paper
Revealed preference with limited consideration
Demuynck, T.; Seel, C.
2014-01-01
We derive revealed preference tests for models where individuals use consideration sets to simplify their consumption problem. Our basic test provides necessary and sufficient conditions for consistency of observed choices with the existence of consideration set restrictions. The same conditions can
Statistical methods for mechanical characterization of randomly reinforced media
Tashkinov, Mikhail
2017-12-01
Advanced materials with heterogeneous microstructure attract extensive interest from researchers and engineers due to their combination of unique properties and the ability to create materials that are most suitable for each specific application. One of the challenging tasks is the development of models of mechanical behavior for such materials, since the precision of the numerical results depends strongly on how fully the features of their heterogeneous microstructure are taken into account. In most cases, numerical modeling of composite structures is based on multiscale approaches that require special techniques for establishing connections between parameters at different scales. This work offers a review of the instruments of statistics and probability theory that are used for the mechanical characterization of heterogeneous media with random positions of reinforcements. Such statistical descriptors are involved in assessing correlations between the microstructural components and are part of mechanical theories that require formalization of information about microstructural morphology. In particular, the paper addresses the application of statistical instruments to geometry description and media reconstruction, as well as their use in homogenization methods and in local stochastic stress and strain field analysis.
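A minimal example of the statistical descriptors discussed in this line of work is the two-point probability function S2(r). The 1-D uncorrelated binary microstructure below is a toy assumption used only to show the estimator, not a model from the paper.

```python
import random

# Sketch of a basic microstructural descriptor: the two-point probability
# S2(r), the chance that two points a distance r apart both fall in the
# reinforcement phase. (Toy 1-D uncorrelated medium with an assumed volume
# fraction; real composites require 2-D/3-D images and correlated morphology.)
def two_point_probability(phase, r):
    n = len(phase) - r
    return sum(phase[i] & phase[i + r] for i in range(n)) / n

rng = random.Random(5)
volume_fraction = 0.3                  # hypothetical reinforcement fraction
medium = [1 if rng.random() < volume_fraction else 0 for _ in range(200_000)]

print(round(two_point_probability(medium, 0), 3))   # S2(0) recovers the volume fraction
print(round(two_point_probability(medium, 10), 3))  # ~ vf^2 for an uncorrelated medium
```

For a correlated microstructure, the decay of S2(r) from the volume fraction down to its square carries the morphological information that reconstruction and homogenization methods exploit.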
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics.
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t -test. This "naive" approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t -test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment.
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
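The "naive" summary approach the abstract critiques can be sketched directly. The simulated hierarchy and effect size below are assumptions for illustration; the paper's variance-weighted sufficient-summary-statistic refinement is not shown.

```python
import math
import random
import statistics

# Sketch of the "naive" nested-data approach described above: reduce each
# subject to a single mean, then form a one-sample t-statistic on those
# means against the group-level null of zero effect. (The paper's
# sufficient-summary-statistic approach additionally weights subjects by
# their within-subject variances to gain power.)
def naive_group_t(subject_samples):
    means = [statistics.mean(s) for s in subject_samples]
    n = len(means)
    return statistics.mean(means) / (statistics.stdev(means) / math.sqrt(n))

rng = random.Random(0)
# Hypothetical hierarchy: 12 subjects, 30 noisy samples each, true effect 0.5.
data = [[0.5 + rng.gauss(0, 1) for _ in range(30)] for _ in range(12)]
print(round(naive_group_t(data), 2))
```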
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical baseline assessment in cardiotocography.
Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura
2017-07-01
Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists in the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such procedure to clinical data; and to compare the statistically-determined ΔFHR value against the experimentally-determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted in a FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, according to which ΔFHR was computed as FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically-determined ΔFHR value was 9.7 bpm. Such value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P=0.16). Thus, ΔFHR=10 bpm is preferable over 8 bpm because it is both experimentally and statistically validated.
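The windowed baseline computation the abstract outlines can be sketched as follows. The 4 Hz sampling rate and the synthetic trace are assumptions for illustration; the study applied 20-min windows to the CTU-UHB recordings after a preprocessing step not reproduced here.

```python
import random
import statistics

# Sketch of the statistical BL assessment described above: split the FHR
# series into 20-min windows and report BL = mean FHR, with the spread dFHR
# taken as the window's standard deviation. (Assumed 4 Hz sampling and a
# synthetic trace; the paper also interpolates missing data and rejects
# windows with more than 10% corrected samples.)
def baseline_per_window(fhr, fs_hz=4, window_min=20):
    win = int(window_min * 60 * fs_hz)
    results = []
    for start in range(0, len(fhr) - win + 1, win):
        w = fhr[start:start + win]
        results.append((statistics.mean(w), statistics.stdev(w)))
    return results

rng = random.Random(3)
# Hypothetical 40-minute trace around 140 bpm with ~10 bpm spread.
trace = [140 + rng.gauss(0, 10) for _ in range(40 * 60 * 4)]
for mean_fhr, delta_fhr in baseline_per_window(trace):
    print(f"BL = {mean_fhr:.1f} bpm, dFHR = {delta_fhr:.1f} bpm")
```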
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Fundamental statistical theories
International Nuclear Information System (INIS)
Demopoulos, W.
1976-01-01
Einstein argued that since quantum mechanics is not a fundamental theory, it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free. In this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover, the quantum theory is complete. (B.R.H.)
Maps and statistics on plague in the United States (a CDC resource for healthcare professionals, clinicians, public health officials, and veterinarians).
Valley Fever (Coccidioidomycosis) Statistics
CDC statistics resource covering candidiasis of the mouth, throat, and esophagus, vaginal candidiasis, invasive candidiasis, and Candida auris, with sections on definition, symptoms, risk and prevention, diagnosis, and treatment.
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
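The idea that Bayesian inference can proceed from posterior samples alone, even when only the shape of the posterior density is known, can be sketched in a few lines. Acceptance-rejection sampling is used here as a simple illustrative stand-in (the book works with more capable samplers); the coin-bias example and all names are hypothetical.

```python
import random

def rejection_sample(unnorm_pdf, lo, hi, pdf_max, n, seed=0):
    """Draw n samples from a density known only up to shape on [lo, hi]:
    propose a uniform point in the bounding box and keep it if it falls
    under the (unnormalised) curve."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi)
        if rng.uniform(0.0, pdf_max) < unnorm_pdf(x):
            out.append(x)
    return out

# Hypothetical example: posterior of a coin's bias after 7 heads in
# 10 tosses with a flat prior, p(t) proportional to t^7 * (1 - t)^3;
# the mode at t = 0.7 bounds the density.
post = lambda t: t ** 7 * (1 - t) ** 3
samples = rejection_sample(post, 0.0, 1.0, post(0.7), 5000)
posterior_mean = sum(samples) / len(samples)  # estimates 8 / 12
```

Once the samples exist, any Bayesian summary (posterior mean, credible interval, tail probability) is just an average over them, which is the central point of the computational approach.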
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Shasha, Dennis
2010-01-01
Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
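The resampling idea described above can be sketched as a percentile bootstrap: recompute the statistic on many resamples drawn with replacement and read a confidence interval off the empirical quantiles, with no distributional assumption. The data and helper names below are illustrative, not from the book.

```python
import random

def bootstrap_ci(sample, stat, n_resamples=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary statistic.

    The sampling distribution of `stat` is approximated by recomputing it
    on resamples drawn with replacement from the observed data, so no
    underlying distribution (normal or otherwise) needs to be assumed."""
    rng = random.Random(seed)
    n = len(sample)
    estimates = sorted(
        stat([sample[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_resamples)
    )
    lower = estimates[int((alpha / 2) * n_resamples)]
    upper = estimates[int((1 - alpha / 2) * n_resamples) - 1]
    return lower, upper

# Illustrative data: a 95% interval for the mean via pure counting.
data = [2.1, 2.5, 1.9, 3.2, 2.8, 2.4, 2.6, 3.0, 2.2, 2.7]
low, high = bootstrap_ci(data, lambda xs: sum(xs) / len(xs))
```

Swapping the lambda for a median, trimmed mean, or any other statistic requires no new theory, which is exactly the appeal of the distribution-independent approach.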
Experimental techniques
Energy Technology Data Exchange (ETDEWEB)
Roussel-Chomaz, P. [GANIL CNRS/IN2P3, CEA/DSM, 14 - Caen (France)
2007-07-01
This lecture presents the experimental techniques developed in the last 10 or 15 years to perform a new class of experiments with exotic nuclei, in which the reactions induced by these nuclei give access to information on their structure. A brief review of the secondary-beam production methods will be given, with some examples of facilities in operation or under project. The important developments performed recently on cryogenic targets will be presented. The different detection systems will be reviewed, both the beam detectors placed before the target and the many kinds of detectors necessary to detect all outgoing particles after the reaction: magnetic spectrometers for the heavy fragment, detection systems for the target recoil nucleus, and gamma detectors. Finally, several typical examples of experiments will be detailed in order to illustrate the use of each detector, either alone or in coincidence with others. (author)
Demystifying EQA statistics and reports.
Coucke, Wim; Soumali, Mohamed Rida
2017-02-15
Reports act as an important feedback tool in External Quality Assessment (EQA). Their main role is to score laboratories for their performance in an EQA round. The most common scores that apply to quantitative data are Q- and Z-scores. To calculate these scores, EQA providers need to have an assigned value and standard deviation for the sample. Both assigned values and standard deviations can be derived chemically or statistically. When derived statistically, different anomalies against the normal distribution of the data have to be handled. Various procedures for evaluating laboratories are able to handle these anomalies. Formal tests and graphical representation techniques are discussed, and suggestions are given to help in choosing between the different evaluation techniques. In order to obtain reliable estimates for calculating performance scores, a sufficient amount of data is needed. There is no general agreement about the minimum number needed. A solution for very small numbers is proposed by changing the limits of evaluation. Apart from analyte- and sample-specific laboratory evaluation, supplementary information can be obtained by combining results for different analytes and samples. Various techniques are reviewed. It is shown that combining results leads to supplementary information, not only for quantitative, but also for qualitative and semi-quantitative analytes.
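As a rough illustration of the scoring described above, the sketch below derives an assigned value and standard deviation statistically from peer results (median and scaled MAD, one common robust choice among several) and computes Z- and Q-scores; actual EQA schemes fix these conventions in their protocols, and all data below are hypothetical.

```python
import statistics

def eqa_scores(results, lab_result):
    """Score one laboratory against peer-group consensus (a sketch; real
    EQA schemes define assigned values and SDs per their own protocol).

    Assigned value: robust consensus (median of all peer results).
    SD for scoring: scaled median absolute deviation, robust to outliers.
    Z-score: deviation in SD units; |Z| <= 2 is commonly 'satisfactory'.
    Q-score: relative deviation from the assigned value.
    """
    assigned = statistics.median(results)
    mad = statistics.median(abs(r - assigned) for r in results)
    sd = 1.4826 * mad  # consistent with the SD under normality
    z = (lab_result - assigned) / sd
    q = (lab_result - assigned) / assigned
    return assigned, sd, z, q

# Hypothetical EQA round with one outlying peer result at 9.0;
# the median/MAD estimates are barely affected by it.
results = [5.0, 5.2, 4.9, 5.1, 5.0, 5.3, 4.8, 9.0]
assigned, sd, z, q = eqa_scores(results, lab_result=5.5)
```

Note how the robust estimates absorb the outlier: a plain mean and standard deviation over the same data would be pulled far toward 9.0.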
Radar imaging using statistical orthogonality
Falconer, David G.
2000-08-01
Statistical orthogonality provides a mathematical basis for imaging scattering data with an inversion algorithm that is both robust and economic. The statistical technique is based on the approximate orthogonality of vectors whose elements are exponential functions with imaginary arguments and random phase angles. This orthogonality allows one to image radar data without first inverting a matrix whose dimensionality equals or exceeds the number of pixels or voxels in the algorithmic image. Additionally, statistics-based methods are applicable to data sets collected under a wide range of operational conditions, e.g., the random flight paths of the curvilinear SAR, the frequency-hopping emissions of ultra-wideband radar, or the narrowband data collected with a bistatic radar. The statistical approach also avoids the often-challenging and computationally intensive task of converting the collected measurements to a data format that is appropriate for imaging with a fast Fourier transform (FFT) or fast tomography algorithm (FTA), e.g., interpolating from polar to rectangular coordinates, or conversely.
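The statistical orthogonality the paper relies on is easy to demonstrate numerically: a vector of unit-magnitude complex exponentials with random phases has inner product exactly n with itself, but only about sqrt(n) with an independent random-phase vector. A minimal sketch of that property (not the paper's imaging code; sizes and seeds are arbitrary):

```python
import cmath
import math
import random

def random_phase_vector(n, rng):
    """Unit-magnitude complex exponentials with uniformly random phases."""
    return [cmath.exp(1j * rng.uniform(0.0, 2.0 * math.pi)) for _ in range(n)]

def inner(u, v):
    """Hermitian inner product <u, v> = sum_k u_k * conj(v_k)."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

rng = random.Random(1)
n = 4096
u = random_phase_vector(n, rng)
v = random_phase_vector(n, rng)

self_norm = abs(inner(u, u)) / n  # exactly 1: each |u_k|^2 = 1
cross = abs(inner(u, v)) / n      # O(1/sqrt(n)): approximately orthogonal
```

Because the cross term shrinks like 1/sqrt(n) relative to the self term, a matched-filter style sum over such vectors isolates each pixel's contribution without any matrix inversion.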
Statistical perspectives on inverse problems
DEFF Research Database (Denmark)
Andersen, Kim Emil
Inverse problems arise in many scientific disciplines and pertain to situations where inference is to be made about a particular phenomenon from indirect measurements. A typical example, arising in diffusion tomography, is the inverse boundary value problem for non-invasive reconstruction of the interior of an object from electrical boundary measurements. One part of this thesis concerns statistical approaches for solving, possibly non-linear, inverse problems. Thus inverse problems are recast in a form suitable for statistical inference. In particular, a Bayesian approach for regularisation ... is given in terms of probability distributions. Posterior inference is obtained by Markov chain Monte Carlo methods, and new, powerful simulation techniques based on e.g. coupled Markov chains and simulated tempering are developed to improve the computational efficiency of the overall simulation ...
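A random-walk Metropolis sampler is the simplest instance of the Markov chain Monte Carlo machinery mentioned above; the thesis develops far more powerful variants (coupled chains, simulated tempering), but the core loop looks like this sketch, here with a toy standard-normal target rather than a tomographic posterior:

```python
import math
import random

def metropolis(log_post, x0, steps=20_000, scale=0.5, seed=0):
    """Random-walk Metropolis: propose a Gaussian step and accept it with
    probability min(1, post(prop) / post(x)); only the unnormalised
    log-posterior is needed, which is what makes MCMC attractive for
    inverse problems where the normalising constant is intractable."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        # Metropolis acceptance rule on the log scale
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: standard normal, log-density -x^2/2 up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
```

Discarding an initial burn-in, sample averages of the chain approximate posterior expectations, which is exactly how posterior inference proceeds in the statistical treatment of inverse problems.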
Statistics for High Energy Physics
CERN. Geneva
2018-01-01
The lectures emphasize the frequentist approach used for the Dark Matter search and for the Higgs search, discovery, and measurements of its properties. An emphasis is put on hypothesis testing using the asymptotic formulae formalism and its derivation, and on the derivation of the trial factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman-Pearson, Feldman-Cousins, Coverage, CLs, Nuisance Parameters Impact, Look Elsewhere Effect... Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals," Phys. Rev. D 57, 3873 (1998). A. L. Read, "Presentation of search results: the CL(s) technique," J. Phys. G 28, 2693 (2002). G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics," Eur. Phys. J. C 71, 1554 (2011); Erratum: [Eur. Phys. J. C 73...
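One of the asymptotic formulae from the cited Cowan-Cranmer-Gross-Vitells paper, the median discovery significance for a single-bin counting experiment, is compact enough to state directly; the function below is a sketch of that one formula, not of the full profile-likelihood machinery the lectures cover:

```python
import math

def discovery_significance(s, b):
    """Median discovery significance for expected signal s on expected
    background b (Cowan, Cranmer, Gross, Vitells, Eur. Phys. J. C 71, 1554):
        Z = sqrt(2 * ((s + b) * ln(1 + s / b) - s))
    For s << b this approaches the familiar s / sqrt(b), and it is
    always slightly below that naive estimate."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))
```

For example, s = 10 expected signal events on b = 100 background gives Z just under 1, a bit less than the naive 10 / sqrt(100) = 1.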
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th-century astronomer T. N. Thiele, who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes c...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Principles of medical statistics
National Research Council Canada - National Science Library
Feinstein, Alvan R
2002-01-01
... or limited attention. They are then offered a simple, superficial account of the most common doctrines and applications of statistical theory. The "get-it-over-with-quickly" approach has been encouraged and often necessitated by the short time given to statistics in modern biomedical education. The curriculum is supposed to provide fundament...
Statistical Engine Knock Control
DEFF Research Database (Denmark)
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...
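The hypothesis-test idea sketched in the abstract, comparing a threshold to the average maximal knock-sensor amplitude, might look like the following one-sided test. This is an illustrative reading of the abstract, not Stotsky's actual controller; all names, thresholds, and data are assumptions.

```python
import math
import statistics

def knock_detected(amplitudes, threshold, z_crit=1.645):
    """One-sided test of H0: mean peak amplitude <= threshold.

    Knock is flagged when the sample mean of per-cycle maximal
    amplitudes exceeds the threshold by more than z_crit standard
    errors (z_crit = 1.645 gives roughly a 5% false-alarm rate under
    approximate normality of the sample mean)."""
    n = len(amplitudes)
    mean = statistics.fmean(amplitudes)
    se = statistics.stdev(amplitudes) / math.sqrt(n)
    return (mean - threshold) / se > z_crit

# Hypothetical per-cycle peak amplitudes at the knock frequency.
quiet = [1.0, 1.1, 0.9, 1.05, 0.95] * 4   # normal combustion
loud = [2.0, 2.2, 1.9, 2.1, 2.05] * 4     # knocking cycles
```

Averaging over many cycles before testing is what makes the decision statistical rather than a raw per-cycle threshold comparison, which is the point the abstract emphasises.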